Apr 17 17:25:04.209741 ip-10-0-131-192 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:25:04.665005 ip-10-0-131-192 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:04.665005 ip-10-0-131-192 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:25:04.665005 ip-10-0-131-192 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:04.665005 ip-10-0-131-192 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:25:04.665005 ip-10-0-131-192 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:04.666579 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.666488 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:25:04.671209 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671187 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:04.671209 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671207 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:04.671209 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671211 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671214 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671219 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671222 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671224 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671227 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671230 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671232 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671235 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671238 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671240 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671244 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671247 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671249 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671252 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671254 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671257 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671260 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671263 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671266 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:04.671310 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671269 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671274 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671278 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671281 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671284 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671286 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671289 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671292 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671295 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671298 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671300 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671302 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671307 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671310 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671314 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671316 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671319 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671322 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671324 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:04.671840 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671327 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671330 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671332 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671335 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671337 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671340 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671343 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671345 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671349 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671352 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671355 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671358 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671360 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671363 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671366 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671368 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671371 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671373 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671376 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671379 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:04.672568 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671381 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671384 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671386 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671389 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671392 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671396 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671399 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671402 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671405 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671408 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671411 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671413 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671416 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671418 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671421 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671437 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671441 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671444 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671448 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671452 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:04.673228 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671456 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:04.673753 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671459 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:04.673753 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671462 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:04.673753 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671465 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:04.673753 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.671467 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:04.674056 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674038 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:04.674056 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674054 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:04.674056 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674058 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674061 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674064 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674067 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674069 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674072 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674077 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674081 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674084 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674086 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674089 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674093 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674096 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674098 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674101 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674104 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674106 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674108 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674111 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674113 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:04.674134 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674116 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674118 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674121 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674124 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674126 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674133 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674136 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674139 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674144 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674149 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674152 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674155 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674157 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674160 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674162 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674165 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674167 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674170 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674172 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:04.674660 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674175 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674177 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674180 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674183 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674185 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674188 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674191 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674193 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674196 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674198 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674201 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674203 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674206 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674209 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674212 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674216 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674219 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674221 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674224 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674227 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:04.675135 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674230 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674232 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674235 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674237 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674240 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674242 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674245 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674247 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674250 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674252 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674255 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674257 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674260 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674262 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674264 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674267 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674269 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674272 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674275 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:04.675685 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674277 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674280 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674283 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674285 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674288 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.674290 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674373 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674381 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674388 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674393 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674398 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674401 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674406 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674411 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674414 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674417 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674434 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674440 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674445 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674449 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674453 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674455 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674458 2573 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:25:04.676140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674461 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674464 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674468 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674471 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674474 2573 flags.go:64] FLAG: --config-dir=""
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674477 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674481 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674485 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674489 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674492 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674495 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674498 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674501 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674504 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674507 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674510 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674514 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674518 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674521 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674524 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674527 2573 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674530 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674534 2573 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674537 2573 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674540 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:25:04.676711 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674543 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674546 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674550 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674553 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674556 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674559 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674562 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674565 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674568 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:25:04.677362 ip-10-0-131-192
kubenswrapper[2573]: I0417 17:25:04.674571 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674573 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674576 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674579 2573 flags.go:64] FLAG: --feature-gates="" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674583 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674586 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674589 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674593 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674596 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674599 2573 flags.go:64] FLAG: --help="false" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674602 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-131-192.ec2.internal" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674605 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674608 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674616 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 
17:25:04.674620 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:25:04.677362 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674624 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674627 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674630 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674633 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674636 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674639 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674643 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674646 2573 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674649 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674651 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674655 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674657 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674660 2573 flags.go:64] FLAG: --lock-file="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674663 
2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674666 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674670 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674675 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674678 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674681 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674684 2573 flags.go:64] FLAG: --logging-format="text" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674687 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674690 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674693 2573 flags.go:64] FLAG: --manifest-url="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674696 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674701 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:25:04.677962 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674704 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674709 2573 flags.go:64] FLAG: --max-pods="110" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674712 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:25:04.678607 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674715 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674718 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674721 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674725 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674728 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674731 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674739 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674742 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674745 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674748 2573 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674751 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674756 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674759 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674762 2573 flags.go:64] 
FLAG: --pods-per-core="0" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674766 2573 flags.go:64] FLAG: --port="10250" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674769 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674772 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07e1aa953fff73296" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674775 2573 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674778 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674781 2573 flags.go:64] FLAG: --register-node="true" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674783 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:25:04.678607 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674786 2573 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674790 2573 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674793 2573 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674796 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674798 2573 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674802 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674805 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674808 2573 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674811 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674819 2573 flags.go:64] FLAG: --runonce="false" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674822 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674825 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674828 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674832 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674835 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674838 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674842 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674845 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674848 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674851 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674854 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674857 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 
17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674860 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674863 2573 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674866 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:25:04.679175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674872 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674875 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674877 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674882 2573 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674885 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674887 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674890 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674893 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674897 2573 flags.go:64] FLAG: --v="2" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674901 2573 flags.go:64] FLAG: --version="false" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674905 2573 flags.go:64] FLAG: --vmodule="" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674909 2573 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.674913 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675009 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675012 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675015 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675018 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675021 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675023 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675026 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675030 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675033 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675035 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:25:04.679790 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675038 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 
17:25:04.675040 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675043 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675046 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675048 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675050 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675053 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675056 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675058 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675062 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675066 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675070 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675073 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675076 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675079 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675081 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675084 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675087 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675089 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:25:04.680343 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675092 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675095 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675098 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675100 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675103 2573 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675105 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675108 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675110 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675113 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675116 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675119 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675122 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675125 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675127 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675130 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675133 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675135 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 
17:25:04.675138 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675141 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675144 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:25:04.680869 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675146 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675149 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675152 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675154 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675156 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675160 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675162 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675165 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675167 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675170 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:25:04.681373 
ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675172 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675174 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675177 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675179 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675182 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675184 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675187 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675189 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675192 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675195 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:25:04.681373 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675197 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675201 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675205 2573 feature_gate.go:328] 
unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675207 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675210 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675213 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675215 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675218 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675220 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675223 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675226 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675228 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675231 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675233 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675236 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675238 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:04.681884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.675241 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:04.682285 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.675872 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:04.682495 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.682470 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:25:04.682532 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.682495 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:25:04.682564 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682551 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:04.682564 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682557 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:04.682564 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682561 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:04.682564 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682564 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682567 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682570 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682573 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682577 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682580 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682583 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682586 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682588 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682591 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682594 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682597 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682600 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682603 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682605 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682609 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682612 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682614 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682618 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682620 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:04.682670 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682623 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682625 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682629 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682631 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682634 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682637 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682640 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682642 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682645 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682648 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682650 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682653 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682655 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682658 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682662 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682665 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682667 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682670 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682673 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682676 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:04.683160 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682680 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682686 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682690 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682694 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682697 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682700 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682702 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682705 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682708 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682710 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682713 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682716 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682719 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682721 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682724 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682727 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682729 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682732 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682734 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:04.683677 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682737 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682739 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682742 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682745 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682748 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682750 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682753 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682756 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682759 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682762 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682764 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682767 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682769 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682772 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682774 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682777 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682780 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682783 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682786 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682788 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:04.684147 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682791 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682794 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682797 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682799 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.682805 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682916 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682921 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682924 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682927 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682930 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682933 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682935 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682938 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682941 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682943 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682946 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:04.684650 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682949 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682951 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682955 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682957 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682960 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682963 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682965 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682968 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682970 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682973 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682975 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682978 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682981 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682983 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682986 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682989 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682991 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682994 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682997 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.682999 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:04.685055 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683002 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683005 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683007 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683010 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683012 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683015 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683017 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683020 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683022 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683025 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683027 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683030 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683033 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683036 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683039 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683041 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683044 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683047 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683049 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:04.685646 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683052 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683054 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683057 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683060 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683062 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683065 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683068 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683070 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683073 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683076 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683079 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683081 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683083 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683086 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683089 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683093 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683096 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683099 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683102 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:04.686164 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683105 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683108 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683111 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683113 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683116 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683118 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683122 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683124 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683127 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683130 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683133 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683136 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683139 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683141 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683144 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683146 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:04.686654 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:04.683149 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:04.687049 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.683155 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:04.687049 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.683932 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:25:04.687731 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.687714 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:25:04.688709 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.688693 2573 server.go:1019] "Starting client certificate rotation"
Apr 17 17:25:04.688815 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.688795 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:04.688849 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.688835 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:04.713336 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.713312 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:04.715808 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.715783 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:04.734162 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.734132 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:25:04.740296 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.740270 2573 log.go:25] "Validated CRI v1 image API"
Apr 17 17:25:04.743056 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.743034 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:25:04.746268 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.746246 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:25:04.746889 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.746867 2573 fs.go:135] Filesystem UUIDs: map[32aa26d4-ef74-4bb8-8093-73d8c97cfff1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ad7fb6d8-fb50-4cc8-ba53-8ef52a574d4c:/dev/nvme0n1p4]
Apr 17 17:25:04.746936 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.746889 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:25:04.752257 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752138 2573 manager.go:217] Machine: {Timestamp:2026-04-17 17:25:04.750685407 +0000 UTC m=+0.418693976 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3113128 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25d9eb7bcd84f5341fa6c82161549c SystemUUID:ec25d9eb-7bcd-84f5-341f-a6c82161549c BootID:b78d0e82-2f74-434b-b3a3-1873b0104b7e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cb:af:18:3f:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cb:af:18:3f:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a6:1c:26:cb:9a:5d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:25:04.752257 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752249 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:25:04.752374 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752339 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:25:04.752710 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752685 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:25:04.752851 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752712 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-192.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:25:04.752898 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752864 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:25:04.752898 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.752873 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:25:04.753552 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.753541 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:25:04.753588 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.753560 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:25:04.755157 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.755144 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:25:04.755279 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.755269 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:25:04.758078 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.758065 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:25:04.758126 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.758082 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:25:04.758955 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.758944 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:25:04.759002 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.758959 2573 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:25:04.759002 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.758969 2573 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 17 17:25:04.760027 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.760014 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:25:04.760083 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.760034 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:25:04.763361 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.763342 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:25:04.765269 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.765251 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:25:04.766760 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766746 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766766 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766776 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766784 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766793 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766802 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766810 2573 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766818 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766831 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:25:04.766838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766840 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:25:04.767112 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766867 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:25:04.767112 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.766881 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:25:04.768657 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.768645 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:25:04.768714 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.768663 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:25:04.769796 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.769752 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:25:04.769796 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.769754 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-192.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:25:04.771286 ip-10-0-131-192 kubenswrapper[2573]: 
I0417 17:25:04.771262 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t44mj" Apr 17 17:25:04.771379 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.771357 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-192.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:25:04.772348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.772331 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:25:04.772453 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.772381 2573 server.go:1295] "Started kubelet" Apr 17 17:25:04.772850 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.772459 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:25:04.773157 ip-10-0-131-192 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:25:04.773312 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.772556 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:25:04.773312 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.773242 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:25:04.778087 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.777950 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:25:04.779132 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.779104 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t44mj" Apr 17 17:25:04.779984 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.779962 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:25:04.783337 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.783309 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:25:04.783695 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.783676 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:25:04.783695 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.783684 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:25:04.784377 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784358 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:25:04.784377 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784361 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:25:04.784549 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784384 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:25:04.784549 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784470 2573 factory.go:55] Registering systemd factory Apr 17 17:25:04.784549 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784500 2573 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:25:04.784549 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784521 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:25:04.784549 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784533 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:25:04.784758 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.784612 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:04.785095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784922 2573 factory.go:153] Registering CRI-O factory Apr 17 17:25:04.785095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784937 2573 factory.go:223] Registration of the crio container factory successfully Apr 17 
17:25:04.785095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.784990 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:25:04.785095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.785008 2573 factory.go:103] Registering Raw factory Apr 17 17:25:04.785095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.785024 2573 manager.go:1196] Started watching for new ooms in manager Apr 17 17:25:04.785674 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.785660 2573 manager.go:319] Starting recovery of all containers Apr 17 17:25:04.792806 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.792773 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:04.793781 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.793757 2573 manager.go:324] Recovery completed Apr 17 17:25:04.795384 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.795244 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-192.ec2.internal\" not found" node="ip-10-0-131-192.ec2.internal" Apr 17 17:25:04.798826 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.798813 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:04.803947 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.803928 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:04.804029 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.803964 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:04.804029 ip-10-0-131-192 kubenswrapper[2573]: I0417 
17:25:04.803981 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:04.804599 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.804583 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:25:04.804681 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.804596 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:25:04.804681 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.804621 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:25:04.806853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.806840 2573 policy_none.go:49] "None policy: Start" Apr 17 17:25:04.806898 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.806857 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:25:04.806898 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.806867 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:25:04.840176 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840157 2573 manager.go:341] "Starting Device Plugin manager" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.840201 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840213 2573 server.go:85] "Starting device plugin registration server" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840526 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840542 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840684 2573 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840762 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.840771 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.841168 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:25:04.857673 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.841209 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:04.918283 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.918192 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:25:04.919461 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.919440 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:25:04.919577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.919475 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:25:04.919577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.919495 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:25:04.919577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.919502 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:25:04.919577 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.919544 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:25:04.922685 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.922664 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:04.941572 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.941547 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:04.942459 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.942441 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:04.942538 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.942475 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:04.942538 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.942486 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:04.942538 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.942512 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-192.ec2.internal" Apr 17 17:25:04.948662 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:04.948646 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-192.ec2.internal" Apr 17 17:25:04.948712 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.948670 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-192.ec2.internal\": node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 
17:25:04.967174 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:04.967145 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.020678 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.020634 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal"] Apr 17 17:25:05.020757 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.020726 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:05.022808 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.022794 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:05.022892 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.022823 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:05.022892 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.022840 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:05.025293 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.025279 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:05.025459 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.025445 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.025508 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.025475 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:05.026070 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.026055 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:05.026179 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.026076 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:05.026179 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.026089 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:05.026179 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.026142 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:05.026179 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.026169 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:05.026359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.026186 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:05.028634 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.028619 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.028729 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.028650 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:05.029334 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.029318 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:05.029446 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.029348 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:05.029446 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.029361 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:05.053499 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.053476 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-192.ec2.internal\" not found" node="ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.057992 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.057971 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-192.ec2.internal\" not found" node="ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.068259 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.068226 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.085822 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.085794 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7bd544cd73c740ecb462759975bc68e0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal\" (UID: \"7bd544cd73c740ecb462759975bc68e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.085905 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.085827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7bd544cd73c740ecb462759975bc68e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal\" (UID: \"7bd544cd73c740ecb462759975bc68e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.085905 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.085848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e647b5a59d131d511f2d693e03bddab4-config\") pod \"kube-apiserver-proxy-ip-10-0-131-192.ec2.internal\" (UID: \"e647b5a59d131d511f2d693e03bddab4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.169002 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.168926 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.186372 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.186350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7bd544cd73c740ecb462759975bc68e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal\" (UID: \"7bd544cd73c740ecb462759975bc68e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.186443 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.186386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/7bd544cd73c740ecb462759975bc68e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal\" (UID: \"7bd544cd73c740ecb462759975bc68e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.186477 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.186460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7bd544cd73c740ecb462759975bc68e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal\" (UID: \"7bd544cd73c740ecb462759975bc68e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.186509 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.186482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e647b5a59d131d511f2d693e03bddab4-config\") pod \"kube-apiserver-proxy-ip-10-0-131-192.ec2.internal\" (UID: \"e647b5a59d131d511f2d693e03bddab4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.186584 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.186545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e647b5a59d131d511f2d693e03bddab4-config\") pod \"kube-apiserver-proxy-ip-10-0-131-192.ec2.internal\" (UID: \"e647b5a59d131d511f2d693e03bddab4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.186584 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.186559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7bd544cd73c740ecb462759975bc68e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal\" (UID: \"7bd544cd73c740ecb462759975bc68e0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.269786 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.269744 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.355279 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.355252 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.359752 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.359733 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.370483 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.370458 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.471057 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.470968 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.571598 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.571560 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.672216 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.672179 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.688600 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.688567 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:25:05.688753 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.688736 2573 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:25:05.688788 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.688766 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:25:05.772314 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:05.772235 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-192.ec2.internal\" not found" Apr 17 17:25:05.781111 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.781069 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:20:04 +0000 UTC" deadline="2028-01-31 17:22:34.443377242 +0000 UTC" Apr 17 17:25:05.781111 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.781106 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15695h57m28.66227672s" Apr 17 17:25:05.784235 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.784202 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:25:05.800037 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.800010 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:25:05.831499 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.831476 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-ttqzk" Apr 17 17:25:05.839888 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.839845 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ttqzk" Apr 17 17:25:05.839888 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.839856 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:05.840613 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.840586 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:05.884729 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.884697 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.895498 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.895471 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:25:05.896387 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.896370 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" Apr 17 17:25:05.907753 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.907726 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:25:05.964379 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:05.964333 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd544cd73c740ecb462759975bc68e0.slice/crio-e9b62adb3b805bd28739eabc98945b233292282f962b2ab19433bc1a3a0060fe WatchSource:0}: Error finding container e9b62adb3b805bd28739eabc98945b233292282f962b2ab19433bc1a3a0060fe: Status 
404 returned error can't find the container with id e9b62adb3b805bd28739eabc98945b233292282f962b2ab19433bc1a3a0060fe Apr 17 17:25:05.964887 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:05.964867 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode647b5a59d131d511f2d693e03bddab4.slice/crio-7f401d6e9b2d69c1eb9fc3e9b30f60a15811ca84bc092f305ac6cb88930b63e6 WatchSource:0}: Error finding container 7f401d6e9b2d69c1eb9fc3e9b30f60a15811ca84bc092f305ac6cb88930b63e6: Status 404 returned error can't find the container with id 7f401d6e9b2d69c1eb9fc3e9b30f60a15811ca84bc092f305ac6cb88930b63e6 Apr 17 17:25:05.968214 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:05.968199 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:25:06.760578 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.760548 2573 apiserver.go:52] "Watching apiserver" Apr 17 17:25:06.768597 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.768571 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:25:06.768919 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.768897 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d","openshift-cluster-node-tuning-operator/tuned-pj447","openshift-dns/node-resolver-plmjc","openshift-image-registry/node-ca-gcsv8","openshift-multus/multus-additional-cni-plugins-8zj4h","openshift-multus/network-metrics-daemon-6hw86","openshift-network-diagnostics/network-check-target-56d9d","kube-system/konnectivity-agent-c87qk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal","openshift-multus/multus-pgqq4","openshift-network-operator/iptables-alerter-j82wk","openshift-ovn-kubernetes/ovnkube-node-hr974"] Apr 17 17:25:06.771131 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.771113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:06.771217 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.771196 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:06.773180 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.773163 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:06.773262 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.773242 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:06.776467 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.776356 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.778654 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.778626 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:25:06.778782 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.778693 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5x4pc\"" Apr 17 17:25:06.778782 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.778728 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:06.780640 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.780619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-plmjc" Apr 17 17:25:06.780766 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.780709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gcsv8" Apr 17 17:25:06.782932 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.782912 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:25:06.783053 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783007 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:25:06.783053 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783018 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:25:06.783053 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783026 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.783215 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783144 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vlcl6\"" Apr 17 17:25:06.783215 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783160 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:25:06.783215 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783164 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:25:06.783389 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.783369 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zpnn2\"" Apr 17 17:25:06.785452 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.785337 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:25:06.785805 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.785627 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:25:06.785805 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.785652 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6t24b\"" Apr 17 17:25:06.785805 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.785709 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:25:06.786014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.785838 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:25:06.786014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.785850 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:25:06.786876 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.786859 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.788854 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.788838 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:25:06.789404 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.789153 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:25:06.789404 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.789194 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-b6dcv\"" Apr 17 17:25:06.789404 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.789222 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:06.789404 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.789267 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:25:06.791414 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.791254 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:25:06.791523 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.791511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pgqq4" Apr 17 17:25:06.791584 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.791570 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:25:06.791644 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.791632 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-f9gp2\"" Apr 17 17:25:06.793675 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.793655 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:25:06.793766 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.793689 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-v4g62\"" Apr 17 17:25:06.793982 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.793965 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-j82wk" Apr 17 17:25:06.795720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3c913c62-8158-4429-8205-ec0b912f3e95-serviceca\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8" Apr 17 17:25:06.795820 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795734 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.795820 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.795923 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-socket-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.795923 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-etc-selinux\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.795923 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/716e40c1-df03-46db-92f3-31f34b85f083-etc-tuned\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.796038 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-registration-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.796038 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-sys-fs\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.796038 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.795987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-lib-modules\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.796038 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzx88\" (UniqueName: \"kubernetes.io/projected/54c39df0-963a-429e-b7e9-1cf754453932-kube-api-access-rzx88\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:06.796038 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c913c62-8158-4429-8205-ec0b912f3e95-host\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8" Apr 17 17:25:06.796231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796056 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-system-cni-dir\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.796231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ftsh\" (UniqueName: \"kubernetes.io/projected/3c913c62-8158-4429-8205-ec0b912f3e95-kube-api-access-8ftsh\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8" Apr 17 17:25:06.796231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-kubernetes\") pod 
\"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.796231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796138 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rm4v\" (UniqueName: \"kubernetes.io/projected/3ff89a2c-d122-45ca-be53-8716d7af6f26-kube-api-access-2rm4v\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.796231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-modprobe-d\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.796231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796210 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5gqk\" (UniqueName: \"kubernetes.io/projected/716e40c1-df03-46db-92f3-31f34b85f083-kube-api-access-k5gqk\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.796575 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-hosts-file\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc" Apr 17 17:25:06.796575 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796258 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-tmp-dir\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc" Apr 17 17:25:06.796575 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-cnibin\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.796575 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-host\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.796575 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796439 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:06.797014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796872 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-os-release\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.797014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.796955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-cni-binary-copy\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.797118 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797055 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.797118 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysctl-d\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797200 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797119 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-device-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.797200 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysctl-conf\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797200 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:06.797343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797222 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.797343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797262 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-systemd\") pod \"tuned-pj447\" (UID: 
\"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797455 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-sys\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797455 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-var-lib-kubelet\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797455 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:06.797609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4pjm\" (UniqueName: \"kubernetes.io/projected/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-kube-api-access-m4pjm\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc" Apr 17 17:25:06.797609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797500 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62q8w\" 
(UniqueName: \"kubernetes.io/projected/5b573982-e564-43dc-809a-f117e117fa31-kube-api-access-62q8w\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:06.797609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysconfig\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-run\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/716e40c1-df03-46db-92f3-31f34b85f083-tmp\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.797840 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797753 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sp75x\"" Apr 17 17:25:06.797840 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.797778 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:06.797932 ip-10-0-131-192 kubenswrapper[2573]: I0417 
17:25:06.797887 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:06.798060 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.798041 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 17:25:06.799232 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799215 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 17:25:06.799532 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799509 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 17:25:06.799635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799599 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:25:06.799705 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799633 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:25:06.799852 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799834 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 17:25:06.799996 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799875 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zqhdk\""
Apr 17 17:25:06.799996 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.799907 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:25:06.840518 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.840487 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:05 +0000 UTC" deadline="2027-12-03 23:46:37.363401373 +0000 UTC"
Apr 17 17:25:06.840518 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.840514 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14286h21m30.522889321s"
Apr 17 17:25:06.885483 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.885454 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 17:25:06.897958 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.897919 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-systemd-units\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.898141 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.897968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-netns\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.898141 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-kubernetes\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.898141 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rm4v\" (UniqueName: \"kubernetes.io/projected/3ff89a2c-d122-45ca-be53-8716d7af6f26-kube-api-access-2rm4v\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.898141 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-modprobe-d\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.898141 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5gqk\" (UniqueName: \"kubernetes.io/projected/716e40c1-df03-46db-92f3-31f34b85f083-kube-api-access-k5gqk\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-hosts-file\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-tmp-dir\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-cnibin\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-modprobe-d\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-kubernetes\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-run-netns\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-os-release\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.898391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-hosts-file\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898401 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-run-ovn-kubernetes\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-os-release\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl9hg\" (UniqueName: \"kubernetes.io/projected/7b50982c-98df-4df3-9669-7a741ea95eb6-kube-api-access-cl9hg\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-cnibin\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d7db7d0e-fe44-4b7a-9633-00ced1914759-konnectivity-ca\") pod \"konnectivity-agent-c87qk\" (UID: \"d7db7d0e-fe44-4b7a-9633-00ced1914759\") " pod="kube-system/konnectivity-agent-c87qk"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-cni-bin\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-kubelet\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysctl-conf\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898719 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-tmp-dir\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898738 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-log-socket\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.898764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-cni-multus\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysctl-conf\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898875 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-systemd\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-sys\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-var-lib-kubelet\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62q8w\" (UniqueName: \"kubernetes.io/projected/5b573982-e564-43dc-809a-f117e117fa31-kube-api-access-62q8w\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/716e40c1-df03-46db-92f3-31f34b85f083-tmp\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-sys\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-systemd\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.898975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-var-lib-kubelet\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-kubelet\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-ovn\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-os-release\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-socket-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-etc-selinux\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.899359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/716e40c1-df03-46db-92f3-31f34b85f083-etc-tuned\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-ovnkube-config\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899194 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-socket-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-cni-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-etc-kubernetes\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-etc-selinux\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899284 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gffxg\" (UniqueName: \"kubernetes.io/projected/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-kube-api-access-gffxg\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-lib-modules\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzx88\" (UniqueName: \"kubernetes.io/projected/54c39df0-963a-429e-b7e9-1cf754453932-kube-api-access-rzx88\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899360 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c913c62-8158-4429-8205-ec0b912f3e95-host\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-slash\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-etc-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-socket-dir-parent\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3c913c62-8158-4429-8205-ec0b912f3e95-host\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ftsh\" (UniqueName: \"kubernetes.io/projected/3c913c62-8158-4429-8205-ec0b912f3e95-kube-api-access-8ftsh\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-lib-modules\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.900253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-cni-bin\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d7db7d0e-fe44-4b7a-9633-00ced1914759-agent-certs\") pod \"konnectivity-agent-c87qk\" (UID: \"d7db7d0e-fe44-4b7a-9633-00ced1914759\") " pod="kube-system/konnectivity-agent-c87qk"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-cnibin\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgzc\" (UniqueName: \"kubernetes.io/projected/f717fb0f-c30a-4760-8df6-5eb06082af1b-kube-api-access-zzgzc\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-cni-binary-copy\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-host\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-cni-binary-copy\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-system-cni-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-host\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899857 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-multus-certs\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysctl-d\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.899973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-device-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900034 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-systemd\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-cni-binary-copy\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-ovnkube-script-lib\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5b573982-e564-43dc-809a-f117e117fa31-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.900548 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-device-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.900648 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.400600993 +0000 UTC m=+3.068609766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900647 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysctl-d\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900671 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4pjm\" (UniqueName: \"kubernetes.io/projected/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-kube-api-access-m4pjm\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f717fb0f-c30a-4760-8df6-5eb06082af1b-iptables-alerter-script\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b50982c-98df-4df3-9669-7a741ea95eb6-ovn-node-metrics-cert\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-k8s-cni-cncf-io\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-daemon-config\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysconfig\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900890 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-run\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3c913c62-8158-4429-8205-ec0b912f3e95-serviceca\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900953 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-var-lib-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-cni-netd\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.901856 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-etc-sysconfig\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.901856 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.900990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/716e40c1-df03-46db-92f3-31f34b85f083-run\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901003 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-env-overrides\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-hostroot\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901065 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" 
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f717fb0f-c30a-4760-8df6-5eb06082af1b-host-slash\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-registration-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-sys-fs\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-system-cni-dir\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-registration-dir\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-node-log\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ff89a2c-d122-45ca-be53-8716d7af6f26-sys-fs\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901381 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-conf-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3c913c62-8158-4429-8205-ec0b912f3e95-serviceca\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8"
Apr 17 17:25:06.902418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.901404 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b573982-e564-43dc-809a-f117e117fa31-system-cni-dir\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.903037 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.902840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/716e40c1-df03-46db-92f3-31f34b85f083-etc-tuned\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.903037 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.902917 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/716e40c1-df03-46db-92f3-31f34b85f083-tmp\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.907929 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.907902 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:06.907929 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.907929 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:06.908118 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.907942 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:06.908118 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:06.908001 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:07.407981578 +0000 UTC m=+3.075990124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:06.910403 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.910356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ftsh\" (UniqueName: \"kubernetes.io/projected/3c913c62-8158-4429-8205-ec0b912f3e95-kube-api-access-8ftsh\") pod \"node-ca-gcsv8\" (UID: \"3c913c62-8158-4429-8205-ec0b912f3e95\") " pod="openshift-image-registry/node-ca-gcsv8"
Apr 17 17:25:06.910555 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.910447 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4pjm\" (UniqueName: \"kubernetes.io/projected/9cfa6ba8-721d-4b42-963e-828ffe17cdbd-kube-api-access-m4pjm\") pod \"node-resolver-plmjc\" (UID: \"9cfa6ba8-721d-4b42-963e-828ffe17cdbd\") " pod="openshift-dns/node-resolver-plmjc"
Apr 17 17:25:06.910555 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.910516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzx88\" (UniqueName: \"kubernetes.io/projected/54c39df0-963a-429e-b7e9-1cf754453932-kube-api-access-rzx88\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:06.910664 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.910612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rm4v\" (UniqueName: \"kubernetes.io/projected/3ff89a2c-d122-45ca-be53-8716d7af6f26-kube-api-access-2rm4v\") pod \"aws-ebs-csi-driver-node-4225d\" (UID: \"3ff89a2c-d122-45ca-be53-8716d7af6f26\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d"
Apr 17 17:25:06.910762 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.910741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5gqk\" (UniqueName: \"kubernetes.io/projected/716e40c1-df03-46db-92f3-31f34b85f083-kube-api-access-k5gqk\") pod \"tuned-pj447\" (UID: \"716e40c1-df03-46db-92f3-31f34b85f083\") " pod="openshift-cluster-node-tuning-operator/tuned-pj447"
Apr 17 17:25:06.911028 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.911002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62q8w\" (UniqueName: \"kubernetes.io/projected/5b573982-e564-43dc-809a-f117e117fa31-kube-api-access-62q8w\") pod \"multus-additional-cni-plugins-8zj4h\" (UID: \"5b573982-e564-43dc-809a-f117e117fa31\") " pod="openshift-multus/multus-additional-cni-plugins-8zj4h"
Apr 17 17:25:06.924331 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.924268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" event={"ID":"e647b5a59d131d511f2d693e03bddab4","Type":"ContainerStarted","Data":"7f401d6e9b2d69c1eb9fc3e9b30f60a15811ca84bc092f305ac6cb88930b63e6"}
Apr 17 17:25:06.925390 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:06.925362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" event={"ID":"7bd544cd73c740ecb462759975bc68e0","Type":"ContainerStarted","Data":"e9b62adb3b805bd28739eabc98945b233292282f962b2ab19433bc1a3a0060fe"}
Apr 17 17:25:07.002365 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-slash\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-etc-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-socket-dir-parent\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-cni-bin\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-slash\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d7db7d0e-fe44-4b7a-9633-00ced1914759-agent-certs\") pod \"konnectivity-agent-c87qk\" (UID: \"d7db7d0e-fe44-4b7a-9633-00ced1914759\") " pod="kube-system/konnectivity-agent-c87qk"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-cni-bin\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-cnibin\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-socket-dir-parent\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002544 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-etc-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.002573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgzc\" (UniqueName: \"kubernetes.io/projected/f717fb0f-c30a-4760-8df6-5eb06082af1b-kube-api-access-zzgzc\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-cnibin\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-cni-binary-copy\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-system-cni-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-multus-certs\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-systemd\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-ovnkube-script-lib\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-system-cni-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f717fb0f-c30a-4760-8df6-5eb06082af1b-iptables-alerter-script\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-systemd\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b50982c-98df-4df3-9669-7a741ea95eb6-ovn-node-metrics-cert\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-k8s-cni-cncf-io\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002857 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-daemon-config\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002714 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-multus-certs\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-var-lib-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-cni-netd\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-env-overrides\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-hostroot\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.002994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f717fb0f-c30a-4760-8df6-5eb06082af1b-host-slash\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-var-lib-openvswitch\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-node-log\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-conf-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-k8s-cni-cncf-io\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-systemd-units\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-netns\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-run-netns\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-run-ovn-kubernetes\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003225 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl9hg\" (UniqueName: \"kubernetes.io/projected/7b50982c-98df-4df3-9669-7a741ea95eb6-kube-api-access-cl9hg\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d7db7d0e-fe44-4b7a-9633-00ced1914759-konnectivity-ca\") pod \"konnectivity-agent-c87qk\" (UID: \"d7db7d0e-fe44-4b7a-9633-00ced1914759\") " pod="kube-system/konnectivity-agent-c87qk"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-cni-bin\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-cni-binary-copy\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-systemd-units\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-run-netns\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.003824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-run-netns\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-ovnkube-script-lib\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-run-ovn-kubernetes\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-kubelet\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003491 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-kubelet\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-cni-netd\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-log-socket\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974"
Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName:
\"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-log-socket\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-cni-multus\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f717fb0f-c30a-4760-8df6-5eb06082af1b-host-slash\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-kubelet\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003607 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-ovn\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-kubelet\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-os-release\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-cni-multus\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-ovnkube-config\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003691 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-cni-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-env-overrides\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.004671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003703 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-run-ovn\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-host-var-lib-cni-bin\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-etc-kubernetes\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f717fb0f-c30a-4760-8df6-5eb06082af1b-iptables-alerter-script\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003745 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7b50982c-98df-4df3-9669-7a741ea95eb6-node-log\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gffxg\" (UniqueName: \"kubernetes.io/projected/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-kube-api-access-gffxg\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-hostroot\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-etc-kubernetes\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-os-release\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-cni-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.003942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-conf-dir\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.004149 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7b50982c-98df-4df3-9669-7a741ea95eb6-ovnkube-config\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.004217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d7db7d0e-fe44-4b7a-9633-00ced1914759-konnectivity-ca\") pod \"konnectivity-agent-c87qk\" (UID: \"d7db7d0e-fe44-4b7a-9633-00ced1914759\") " pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.004581 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-multus-daemon-config\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.005438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.005360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d7db7d0e-fe44-4b7a-9633-00ced1914759-agent-certs\") pod \"konnectivity-agent-c87qk\" (UID: \"d7db7d0e-fe44-4b7a-9633-00ced1914759\") " pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:07.005947 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.005541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7b50982c-98df-4df3-9669-7a741ea95eb6-ovn-node-metrics-cert\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.011029 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.010943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgzc\" (UniqueName: \"kubernetes.io/projected/f717fb0f-c30a-4760-8df6-5eb06082af1b-kube-api-access-zzgzc\") pod \"iptables-alerter-j82wk\" (UID: \"f717fb0f-c30a-4760-8df6-5eb06082af1b\") " pod="openshift-network-operator/iptables-alerter-j82wk" Apr 17 17:25:07.012002 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.011978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl9hg\" (UniqueName: \"kubernetes.io/projected/7b50982c-98df-4df3-9669-7a741ea95eb6-kube-api-access-cl9hg\") pod \"ovnkube-node-hr974\" (UID: \"7b50982c-98df-4df3-9669-7a741ea95eb6\") " pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.012002 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.012004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gffxg\" 
(UniqueName: \"kubernetes.io/projected/6fd7c0bf-ef91-422c-8dc8-5bc7192ade41-kube-api-access-gffxg\") pod \"multus-pgqq4\" (UID: \"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41\") " pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.037442 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.037395 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:07.062976 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.062950 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:07.088312 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.088273 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pj447" Apr 17 17:25:07.096187 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.096162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-plmjc" Apr 17 17:25:07.109973 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.109949 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gcsv8" Apr 17 17:25:07.117668 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.117644 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" Apr 17 17:25:07.123276 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.123254 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" Apr 17 17:25:07.129881 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.129850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:07.136550 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.136528 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pgqq4" Apr 17 17:25:07.144290 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.144263 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j82wk" Apr 17 17:25:07.149994 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.149973 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:07.407332 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.407251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:07.407536 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:07.407378 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:07.407536 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:07.407459 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:08.407438073 +0000 UTC m=+4.075446634 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:07.508240 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.508203 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:07.508451 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:07.508390 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:07.508451 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:07.508419 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:07.508451 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:07.508447 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:07.508612 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:07.508505 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:08.508487889 +0000 UTC m=+4.176496457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:07.632154 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.632130 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716e40c1_df03_46db_92f3_31f34b85f083.slice/crio-cb16bd0decda45bc1c84213ec4c7762553edc333fd45b41c59f09d674a52380f WatchSource:0}: Error finding container cb16bd0decda45bc1c84213ec4c7762553edc333fd45b41c59f09d674a52380f: Status 404 returned error can't find the container with id cb16bd0decda45bc1c84213ec4c7762553edc333fd45b41c59f09d674a52380f Apr 17 17:25:07.635200 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.635177 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff89a2c_d122_45ca_be53_8716d7af6f26.slice/crio-379cbbc654d79f583c7e98566fc2a4f33301907aea35b5dde100aabd4b80c668 WatchSource:0}: Error finding container 379cbbc654d79f583c7e98566fc2a4f33301907aea35b5dde100aabd4b80c668: Status 404 returned error can't find the container with id 379cbbc654d79f583c7e98566fc2a4f33301907aea35b5dde100aabd4b80c668 Apr 17 17:25:07.637387 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.637351 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd7c0bf_ef91_422c_8dc8_5bc7192ade41.slice/crio-d522a2b391dab55692f5d56eac7307caf317fe669b33c1d648640cd705732a68 WatchSource:0}: Error finding container 
d522a2b391dab55692f5d56eac7307caf317fe669b33c1d648640cd705732a68: Status 404 returned error can't find the container with id d522a2b391dab55692f5d56eac7307caf317fe669b33c1d648640cd705732a68 Apr 17 17:25:07.637956 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.637925 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7db7d0e_fe44_4b7a_9633_00ced1914759.slice/crio-6dc607ac10e6e8d5e5b092473b7acd2d257720dce1916bbd30f760f237217757 WatchSource:0}: Error finding container 6dc607ac10e6e8d5e5b092473b7acd2d257720dce1916bbd30f760f237217757: Status 404 returned error can't find the container with id 6dc607ac10e6e8d5e5b092473b7acd2d257720dce1916bbd30f760f237217757 Apr 17 17:25:07.638811 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.638780 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf717fb0f_c30a_4760_8df6_5eb06082af1b.slice/crio-7c9d100f2537a12df209c2d118c580d86f607d76ca5bd4929158d2c40ccb3ce1 WatchSource:0}: Error finding container 7c9d100f2537a12df209c2d118c580d86f607d76ca5bd4929158d2c40ccb3ce1: Status 404 returned error can't find the container with id 7c9d100f2537a12df209c2d118c580d86f607d76ca5bd4929158d2c40ccb3ce1 Apr 17 17:25:07.641417 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.641391 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cfa6ba8_721d_4b42_963e_828ffe17cdbd.slice/crio-bb28c2a74772859cf35bbd4c5575207ec8c6ddedbeca8a30dbf84904b33520bb WatchSource:0}: Error finding container bb28c2a74772859cf35bbd4c5575207ec8c6ddedbeca8a30dbf84904b33520bb: Status 404 returned error can't find the container with id bb28c2a74772859cf35bbd4c5575207ec8c6ddedbeca8a30dbf84904b33520bb Apr 17 17:25:07.641884 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.641854 2573 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b50982c_98df_4df3_9669_7a741ea95eb6.slice/crio-e21cfafe75897621629dad4de39d2635b4bef7118cf1b9e797a96bb833307d81 WatchSource:0}: Error finding container e21cfafe75897621629dad4de39d2635b4bef7118cf1b9e797a96bb833307d81: Status 404 returned error can't find the container with id e21cfafe75897621629dad4de39d2635b4bef7118cf1b9e797a96bb833307d81 Apr 17 17:25:07.643165 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.643132 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c913c62_8158_4429_8205_ec0b912f3e95.slice/crio-e6a92974213e640060c4d6e8ea89b1593f224cd626d82cd9dd9b4392c38be3b4 WatchSource:0}: Error finding container e6a92974213e640060c4d6e8ea89b1593f224cd626d82cd9dd9b4392c38be3b4: Status 404 returned error can't find the container with id e6a92974213e640060c4d6e8ea89b1593f224cd626d82cd9dd9b4392c38be3b4 Apr 17 17:25:07.643714 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:07.643691 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b573982_e564_43dc_809a_f117e117fa31.slice/crio-5d5ddcc8f9ee20004f419230d5bba7799e298985835c5737bd86c257cd93a4ef WatchSource:0}: Error finding container 5d5ddcc8f9ee20004f419230d5bba7799e298985835c5737bd86c257cd93a4ef: Status 404 returned error can't find the container with id 5d5ddcc8f9ee20004f419230d5bba7799e298985835c5737bd86c257cd93a4ef Apr 17 17:25:07.841725 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.841538 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:05 +0000 UTC" deadline="2027-10-08 09:41:29.728935433 +0000 UTC" Apr 17 17:25:07.841725 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.841720 2573 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12928h16m21.887218749s" Apr 17 17:25:07.928564 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.928453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerStarted","Data":"5d5ddcc8f9ee20004f419230d5bba7799e298985835c5737bd86c257cd93a4ef"} Apr 17 17:25:07.929470 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.929439 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gcsv8" event={"ID":"3c913c62-8158-4429-8205-ec0b912f3e95","Type":"ContainerStarted","Data":"e6a92974213e640060c4d6e8ea89b1593f224cd626d82cd9dd9b4392c38be3b4"} Apr 17 17:25:07.930541 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.930517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"e21cfafe75897621629dad4de39d2635b4bef7118cf1b9e797a96bb833307d81"} Apr 17 17:25:07.931403 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.931379 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-plmjc" event={"ID":"9cfa6ba8-721d-4b42-963e-828ffe17cdbd","Type":"ContainerStarted","Data":"bb28c2a74772859cf35bbd4c5575207ec8c6ddedbeca8a30dbf84904b33520bb"} Apr 17 17:25:07.935095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.935071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j82wk" event={"ID":"f717fb0f-c30a-4760-8df6-5eb06082af1b","Type":"ContainerStarted","Data":"7c9d100f2537a12df209c2d118c580d86f607d76ca5bd4929158d2c40ccb3ce1"} Apr 17 17:25:07.940603 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.940570 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pj447" 
event={"ID":"716e40c1-df03-46db-92f3-31f34b85f083","Type":"ContainerStarted","Data":"cb16bd0decda45bc1c84213ec4c7762553edc333fd45b41c59f09d674a52380f"}
Apr 17 17:25:07.943067 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.943042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" event={"ID":"e647b5a59d131d511f2d693e03bddab4","Type":"ContainerStarted","Data":"c7cd2728d426241a04e46d229dbb28a3c10a9b9dd3e361a26e4581ef6ecd1fa4"}
Apr 17 17:25:07.947360 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.947329 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c87qk" event={"ID":"d7db7d0e-fe44-4b7a-9633-00ced1914759","Type":"ContainerStarted","Data":"6dc607ac10e6e8d5e5b092473b7acd2d257720dce1916bbd30f760f237217757"}
Apr 17 17:25:07.949245 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.949222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pgqq4" event={"ID":"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41","Type":"ContainerStarted","Data":"d522a2b391dab55692f5d56eac7307caf317fe669b33c1d648640cd705732a68"}
Apr 17 17:25:07.950140 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.950124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" event={"ID":"3ff89a2c-d122-45ca-be53-8716d7af6f26","Type":"ContainerStarted","Data":"379cbbc654d79f583c7e98566fc2a4f33301907aea35b5dde100aabd4b80c668"}
Apr 17 17:25:07.959347 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:07.959295 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-192.ec2.internal" podStartSLOduration=2.959277855 podStartE2EDuration="2.959277855s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:07.958901327 +0000 UTC m=+3.626909895" watchObservedRunningTime="2026-04-17 17:25:07.959277855 +0000 UTC m=+3.627286422"
Apr 17 17:25:08.415110 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:08.414220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:08.415110 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.414384 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:08.415110 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.414466 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.414446569 +0000 UTC m=+6.082455120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:08.515259 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:08.515209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:08.515458 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.515373 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:08.515458 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.515393 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:08.515458 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.515407 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:08.515621 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.515479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.515460442 +0000 UTC m=+6.183469005 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:08.922818 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:08.922783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:08.923260 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.922933 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:08.923439 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:08.923389 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:08.923559 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:08.923537 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:08.964710 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:08.964653 2573 generic.go:358] "Generic (PLEG): container finished" podID="7bd544cd73c740ecb462759975bc68e0" containerID="2640aab00f8d582d7709ceb2ac3271f59f79b7bc68f768f5915556b2d0ae306c" exitCode=0
Apr 17 17:25:08.965501 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:08.965472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" event={"ID":"7bd544cd73c740ecb462759975bc68e0","Type":"ContainerDied","Data":"2640aab00f8d582d7709ceb2ac3271f59f79b7bc68f768f5915556b2d0ae306c"}
Apr 17 17:25:09.987700 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:09.987661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" event={"ID":"7bd544cd73c740ecb462759975bc68e0","Type":"ContainerStarted","Data":"7f568d15866a8225c6cc5655836818084a858a4443abd363f627eb695b29b42d"}
Apr 17 17:25:10.431907 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:10.431826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:10.432070 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.431986 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:10.432070 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.432049 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:14.432032096 +0000 UTC m=+10.100040657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:10.533960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:10.533257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:10.533960 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.533483 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:10.533960 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.533504 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:10.533960 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.533517 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:10.533960 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.533577 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:14.533559578 +0000 UTC m=+10.201568135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:10.920315 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:10.920210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:10.920485 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.920348 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:10.920736 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:10.920719 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:10.920848 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:10.920829 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:12.920643 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:12.920608 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:12.921092 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:12.920764 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:12.921173 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:12.921154 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:12.921346 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:12.921322 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:14.464937 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:14.464887 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:14.465554 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.465054 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:14.465554 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.465118 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:22.465100101 +0000 UTC m=+18.133108651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:14.565741 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:14.565665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:14.565941 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.565853 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:14.565941 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.565914 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:14.565941 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.565928 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:14.566103 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.565992 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:22.565972896 +0000 UTC m=+18.233981462 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:14.920832 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:14.920752 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:14.920991 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.920865 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:14.921263 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:14.921239 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:14.921374 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:14.921352 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:16.920100 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:16.920062 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:16.920544 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:16.920066 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:16.920544 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:16.920189 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:16.920544 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:16.920314 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:18.919708 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:18.919676 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:18.919708 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:18.919710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:18.920201 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:18.919796 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:18.920201 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:18.919920 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:20.920567 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:20.920529 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:20.920982 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:20.920643 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:20.920982 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:20.920705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:20.920982 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:20.920822 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:22.530732 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:22.530689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:22.531217 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.530865 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:22.531217 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.530953 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:38.530930184 +0000 UTC m=+34.198938732 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:22.631787 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:22.631750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:22.631955 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.631884 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:22.631955 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.631903 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:22.631955 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.631912 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:22.632063 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.631964 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:38.631948651 +0000 UTC m=+34.299957202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:22.920269 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:22.920191 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:22.920421 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.920329 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:22.920421 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:22.920381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:22.920527 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:22.920490 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:24.921910 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:24.921010 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:25:24.921910 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:24.921129 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f"
Apr 17 17:25:24.921910 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:24.921546 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86"
Apr 17 17:25:24.921910 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:24.921651 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932"
Apr 17 17:25:26.037332 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.037302 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b573982-e564-43dc-809a-f117e117fa31" containerID="c015496fd0c507959eec88e54dc9665bdb07b6d3b278656c4bb5c3ad336a6aab" exitCode=0
Apr 17 17:25:26.038096 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.037383 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerDied","Data":"c015496fd0c507959eec88e54dc9665bdb07b6d3b278656c4bb5c3ad336a6aab"}
Apr 17 17:25:26.041204 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.039141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gcsv8" event={"ID":"3c913c62-8158-4429-8205-ec0b912f3e95","Type":"ContainerStarted","Data":"ad0eff41ddf0e2dfb3c8dd9c468317c1e48494e1232f3c3c476901a92d8b0750"}
Apr 17 17:25:26.044022 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044006 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log"
Apr 17 17:25:26.044349 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044328 2573 generic.go:358] "Generic (PLEG): container finished" podID="7b50982c-98df-4df3-9669-7a741ea95eb6" containerID="6b0a9d43683b972f477060dca0703099274445cae6f796e6e5906f7db922a610" exitCode=1
Apr 17 17:25:26.044473 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"cbc3abca5cb20871f60c45cd4b7400cd27d91f24355224a3f4aff480f2de4eb6"}
Apr 17 17:25:26.044473 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"f883cb72c98aeb687c5860bde77c07146dcd55850fde02b33256f92af37ec3e4"}
Apr 17 17:25:26.044473 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044415 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"08aeec8fe1cd3d8c9f76ba56002b9b6cb43dc0bce690a334a551315369ccf614"}
Apr 17 17:25:26.044473 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044465 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"06af0f07f41735156faa18137ac0be761367f4c74ac6ee4938c34a57a5a49d64"}
Apr 17 17:25:26.044637 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerDied","Data":"6b0a9d43683b972f477060dca0703099274445cae6f796e6e5906f7db922a610"}
Apr 17 17:25:26.044637 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.044493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"22ddbc82efc097a1204996537b6d4d54d3a038d54e2cc5568df13c5d6f059dbd"}
Apr 17 17:25:26.045478 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.045454 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-plmjc" event={"ID":"9cfa6ba8-721d-4b42-963e-828ffe17cdbd","Type":"ContainerStarted","Data":"93afc921abb3dc930db729fa544fbe9a7f5f14c3dd5f7e45a8ae3f13c945055e"}
Apr 17 17:25:26.046659 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.046630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pj447" event={"ID":"716e40c1-df03-46db-92f3-31f34b85f083","Type":"ContainerStarted","Data":"00495663ff6205310d62787b992f9fe3b88e014ab810ef26072df5634d40674a"}
Apr 17 17:25:26.047947 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.047926 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c87qk" event={"ID":"d7db7d0e-fe44-4b7a-9633-00ced1914759","Type":"ContainerStarted","Data":"7fc8afdac0f71c528548478d144a0539a5719e501c3bebd63787be2fed051d04"}
Apr 17 17:25:26.049056 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.049040 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pgqq4" event={"ID":"6fd7c0bf-ef91-422c-8dc8-5bc7192ade41","Type":"ContainerStarted","Data":"823b80d833ea03e09b6b2e5cc1062b61a2b9427948df1476a651d224705ddc7f"}
Apr 17 17:25:26.050205 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.050188 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" event={"ID":"3ff89a2c-d122-45ca-be53-8716d7af6f26","Type":"ContainerStarted","Data":"ad004874b156fa565077dd7d646441305b17bb3ed054f0b5d48309416b7a0b83"}
Apr 17 17:25:26.061730 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.061694 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-192.ec2.internal" podStartSLOduration=21.061684053 podStartE2EDuration="21.061684053s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:10.004981686 +0000 UTC m=+5.672990254" watchObservedRunningTime="2026-04-17 17:25:26.061684053 +0000 UTC m=+21.729692619"
Apr 17 17:25:26.074801 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.074761 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gcsv8" podStartSLOduration=12.025654001 podStartE2EDuration="21.074747348s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.646315879 +0000 UTC m=+3.314324429" lastFinishedPulling="2026-04-17 17:25:16.695409228 +0000 UTC m=+12.363417776" observedRunningTime="2026-04-17 17:25:26.074691804 +0000 UTC m=+21.742700373" watchObservedRunningTime="2026-04-17 17:25:26.074747348 +0000 UTC m=+21.742755916"
Apr 17 17:25:26.092860 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.092825 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pj447" podStartSLOduration=3.892505816 podStartE2EDuration="21.09281263s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.63429014 +0000 UTC m=+3.302298689" lastFinishedPulling="2026-04-17 17:25:24.834596941 +0000 UTC m=+20.502605503" observedRunningTime="2026-04-17 17:25:26.092715556 +0000 UTC m=+21.760724134" watchObservedRunningTime="2026-04-17 17:25:26.09281263 +0000 UTC m=+21.760821196"
Apr 17 17:25:26.109024 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.108960 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c87qk" podStartSLOduration=3.921193173 podStartE2EDuration="21.108948939s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.645207859 +0000 UTC m=+3.313216408" lastFinishedPulling="2026-04-17 17:25:24.832963616 +0000 UTC m=+20.500972174" observedRunningTime="2026-04-17 17:25:26.108702554 +0000 UTC m=+21.776711121" watchObservedRunningTime="2026-04-17 17:25:26.108948939 +0000 UTC m=+21.776957505"
Apr 17 17:25:26.128059 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.128019 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pgqq4" podStartSLOduration=3.923349455 podStartE2EDuration="21.128007106s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.639979714 +0000 UTC m=+3.307988274" lastFinishedPulling="2026-04-17 17:25:24.844637369 +0000 UTC m=+20.512645925" observedRunningTime="2026-04-17 17:25:26.127802347 +0000 UTC m=+21.795810915" watchObservedRunningTime="2026-04-17 17:25:26.128007106 +0000 UTC m=+21.796015673"
Apr 17 17:25:26.145715 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.145673 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-plmjc" podStartSLOduration=3.957659192 podStartE2EDuration="21.145661084s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.644927327 +0000 UTC m=+3.312935878" lastFinishedPulling="2026-04-17 17:25:24.832929221 +0000 UTC m=+20.500937770" observedRunningTime="2026-04-17 17:25:26.145615768 +0000 UTC m=+21.813624334" watchObservedRunningTime="2026-04-17 17:25:26.145661084 +0000 UTC m=+21.813669651"
Apr 17 17:25:26.432018 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.431982 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 17:25:26.851573 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.851397 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:25:26.432005567Z","UUID":"75589639-4983-426f-9d7b-1efdf6f2895d","Handler":null,"Name":"","Endpoint":""}
Apr 17 17:25:26.853295 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.853267 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17
17:25:26.853449 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.853301 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:25:26.920233 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.920192 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:26.920417 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:26.920328 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:26.920417 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:26.920203 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:26.920555 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:26.920523 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:27.054154 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:27.054118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j82wk" event={"ID":"f717fb0f-c30a-4760-8df6-5eb06082af1b","Type":"ContainerStarted","Data":"70b230bcc726de7d3728565f879bf8c6048e7d427098e0c799b672f87a08fc49"} Apr 17 17:25:27.056664 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:27.056612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" event={"ID":"3ff89a2c-d122-45ca-be53-8716d7af6f26","Type":"ContainerStarted","Data":"b07786375bd206fa5b6bc5d40b2eacd5bb3830f1dd3862a67a616f6be2cd6414"} Apr 17 17:25:27.072168 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:27.072085 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j82wk" podStartSLOduration=4.934987491 podStartE2EDuration="22.072065201s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.6452681 +0000 UTC m=+3.313276659" lastFinishedPulling="2026-04-17 17:25:24.782345813 +0000 UTC m=+20.450354369" observedRunningTime="2026-04-17 17:25:27.071296778 +0000 UTC m=+22.739305345" watchObservedRunningTime="2026-04-17 17:25:27.072065201 +0000 UTC m=+22.740073768" Apr 17 17:25:27.131560 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:27.131457 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:27.132345 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:27.132314 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:28.060448 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.060349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" event={"ID":"3ff89a2c-d122-45ca-be53-8716d7af6f26","Type":"ContainerStarted","Data":"cf1b1d1edc7e3fb7e981a47e351bab27c6e56fec44727f36e540ff375efc2847"} Apr 17 17:25:28.063101 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.063062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:25:28.063480 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.063453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"907c1994c50d2e5fc6a8cbaf7f020a844eb2181a94cd7beadd2d0d550e3891ae"} Apr 17 17:25:28.063990 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.063798 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:28.064278 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.064250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c87qk" Apr 17 17:25:28.078826 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.078782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4225d" podStartSLOduration=3.307160802 podStartE2EDuration="23.078770459s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.637041664 +0000 UTC m=+3.305050216" lastFinishedPulling="2026-04-17 17:25:27.408651313 +0000 UTC m=+23.076659873" observedRunningTime="2026-04-17 17:25:28.078099852 +0000 UTC m=+23.746108442" watchObservedRunningTime="2026-04-17 17:25:28.078770459 +0000 UTC m=+23.746779017" Apr 17 17:25:28.920441 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.920247 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:28.920628 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:28.920247 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:28.920628 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:28.920572 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:28.920628 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:28.920618 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:30.923129 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:30.922963 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:30.923917 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:30.922977 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:30.923917 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:30.923209 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:30.923917 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:30.923292 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:31.072635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.072610 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:25:31.072980 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.072949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"717fffae258f30a19512e73aea0711149302dcc6e6aa674ee9818f733ec6aff5"} Apr 17 17:25:31.073221 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.073199 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:31.073478 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.073456 2573 scope.go:117] "RemoveContainer" 
containerID="6b0a9d43683b972f477060dca0703099274445cae6f796e6e5906f7db922a610" Apr 17 17:25:31.074633 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.074612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerStarted","Data":"00d8ae3e5edd5f38b4cadf9a0cbba091ff7a4cce288d20036ca66149f17c4467"} Apr 17 17:25:31.088226 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.088195 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:31.537716 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:31.537689 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:32.077998 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.077961 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b573982-e564-43dc-809a-f117e117fa31" containerID="00d8ae3e5edd5f38b4cadf9a0cbba091ff7a4cce288d20036ca66149f17c4467" exitCode=0 Apr 17 17:25:32.078418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.078051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerDied","Data":"00d8ae3e5edd5f38b4cadf9a0cbba091ff7a4cce288d20036ca66149f17c4467"} Apr 17 17:25:32.081442 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.081409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:25:32.081720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.081700 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" 
event={"ID":"7b50982c-98df-4df3-9669-7a741ea95eb6","Type":"ContainerStarted","Data":"021b7dcc93829a918682aaf47d87cde93b1129f60af41a010be5aef42f7fade0"} Apr 17 17:25:32.082016 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.081992 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:32.095893 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.095874 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:25:32.134276 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.134226 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" podStartSLOduration=9.873263304 podStartE2EDuration="27.134213614s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.645499879 +0000 UTC m=+3.313508430" lastFinishedPulling="2026-04-17 17:25:24.906450195 +0000 UTC m=+20.574458740" observedRunningTime="2026-04-17 17:25:32.133710004 +0000 UTC m=+27.801718570" watchObservedRunningTime="2026-04-17 17:25:32.134213614 +0000 UTC m=+27.802222180" Apr 17 17:25:32.920320 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.920280 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:32.920320 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:32.920300 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:32.920531 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:32.920384 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:32.920585 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:32.920533 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:34.087370 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:34.087340 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b573982-e564-43dc-809a-f117e117fa31" containerID="2a4962ed97126201376530938cd9185e7bf2f35a646681109815fddbc60f8a3b" exitCode=0 Apr 17 17:25:34.087765 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:34.087455 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerDied","Data":"2a4962ed97126201376530938cd9185e7bf2f35a646681109815fddbc60f8a3b"} Apr 17 17:25:34.920785 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:34.920749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:34.920935 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:34.920847 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:34.920935 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:34.920920 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:34.921031 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:34.921013 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:36.093066 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:36.093031 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b573982-e564-43dc-809a-f117e117fa31" containerID="db6a1be191cae87fea7f72271e41371182ca43f3ec90957c96534caf3ffb72cc" exitCode=0 Apr 17 17:25:36.093595 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:36.093084 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerDied","Data":"db6a1be191cae87fea7f72271e41371182ca43f3ec90957c96534caf3ffb72cc"} Apr 17 17:25:36.925169 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:36.925138 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:36.925330 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:36.925138 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:36.925330 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:36.925251 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:36.925418 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:36.925386 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:38.544229 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:38.544176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:38.544842 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.544328 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:38.544842 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.544396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs podName:54c39df0-963a-429e-b7e9-1cf754453932 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:26:10.544375773 +0000 UTC m=+66.212384318 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs") pod "network-metrics-daemon-6hw86" (UID: "54c39df0-963a-429e-b7e9-1cf754453932") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:38.644782 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:38.644746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:38.644981 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.644903 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:38.644981 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.644922 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:38.644981 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.644934 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6vkh2 for pod openshift-network-diagnostics/network-check-target-56d9d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:38.645147 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.644999 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2 
podName:3dbf031f-03a8-4194-a694-20fe7307d30f nodeName:}" failed. No retries permitted until 2026-04-17 17:26:10.644979987 +0000 UTC m=+66.312988534 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6vkh2" (UniqueName: "kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2") pod "network-check-target-56d9d" (UID: "3dbf031f-03a8-4194-a694-20fe7307d30f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:38.922662 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:38.921782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:38.922662 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.921931 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:38.922662 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:38.922260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:38.922662 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:38.922351 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:39.862369 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:39.862109 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6hw86"] Apr 17 17:25:39.862839 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:39.862486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:39.862839 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:39.862630 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:39.865558 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:39.865535 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-56d9d"] Apr 17 17:25:39.865697 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:39.865633 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:39.865759 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:39.865731 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:41.920180 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:41.920126 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:41.920778 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:41.920126 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:41.920778 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:41.920268 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-56d9d" podUID="3dbf031f-03a8-4194-a694-20fe7307d30f" Apr 17 17:25:41.920778 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:41.920386 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6hw86" podUID="54c39df0-963a-429e-b7e9-1cf754453932" Apr 17 17:25:42.107158 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:42.107073 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b573982-e564-43dc-809a-f117e117fa31" containerID="4eaf3a5a13b75616801729643ded5fda5121fd448c0c2a826fe5b3df16a24535" exitCode=0 Apr 17 17:25:42.107290 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:42.107149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerDied","Data":"4eaf3a5a13b75616801729643ded5fda5121fd448c0c2a826fe5b3df16a24535"} Apr 17 17:25:43.111520 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.111490 2573 generic.go:358] "Generic (PLEG): container finished" podID="5b573982-e564-43dc-809a-f117e117fa31" containerID="f9bac7e16f1091fe7a0698d4001c9c7590c2e2d274b43c9117954f184a311836" exitCode=0 Apr 17 17:25:43.111917 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.111544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerDied","Data":"f9bac7e16f1091fe7a0698d4001c9c7590c2e2d274b43c9117954f184a311836"} Apr 17 17:25:43.703723 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.703695 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-192.ec2.internal" event="NodeReady" Apr 17 17:25:43.703884 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.703808 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:25:43.755644 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.755610 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zjcw2"] Apr 17 17:25:43.758161 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.758144 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:43.761613 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.761536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:25:43.761820 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.761794 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:25:43.761954 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.761875 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:25:43.761954 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.761536 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-n9t7x\"" Apr 17 17:25:43.764554 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.763512 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-w7lff"] Apr 17 17:25:43.766301 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.766261 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.767892 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.767873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zjcw2"] Apr 17 17:25:43.768797 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.768781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:25:43.768867 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.768796 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:25:43.769042 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.768989 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:25:43.769148 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.769046 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4n9s5\"" Apr 17 17:25:43.769314 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.769291 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:25:43.776555 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.776531 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-w7lff"] Apr 17 17:25:43.782526 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.782504 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zz2w5"] Apr 17 17:25:43.784560 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.784541 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.786946 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.786929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:25:43.787052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.786975 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:25:43.787052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.787003 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9l5w8\"" Apr 17 17:25:43.794057 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.794034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zz2w5"] Apr 17 17:25:43.885737 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885696 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-tmp-dir\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.885927 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-config-volume\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.885927 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98ef03b7-627d-4085-85f1-8f4765fe1c45-cert\") pod \"ingress-canary-zjcw2\" (UID: \"98ef03b7-627d-4085-85f1-8f4765fe1c45\") " 
pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:43.885927 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz74k\" (UniqueName: \"kubernetes.io/projected/98ef03b7-627d-4085-85f1-8f4765fe1c45-kube-api-access-dz74k\") pod \"ingress-canary-zjcw2\" (UID: \"98ef03b7-627d-4085-85f1-8f4765fe1c45\") " pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:43.885927 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-crio-socket\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.885927 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz2zr\" (UniqueName: \"kubernetes.io/projected/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-kube-api-access-rz2zr\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.886151 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.885944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.886151 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.886009 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-dmqnv\" (UniqueName: \"kubernetes.io/projected/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-kube-api-access-dmqnv\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.886151 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.886075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-metrics-tls\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.886151 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.886091 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-data-volume\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.886151 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.886109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.919911 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.919874 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:25:43.920101 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.919874 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:25:43.922670 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.922646 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:43.922810 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.922750 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vr6gd\"" Apr 17 17:25:43.922810 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.922788 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:43.923007 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.922993 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:43.923090 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.922997 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bp9pv\"" Apr 17 17:25:43.987018 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.986980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-config-volume\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.987018 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987016 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98ef03b7-627d-4085-85f1-8f4765fe1c45-cert\") pod \"ingress-canary-zjcw2\" (UID: \"98ef03b7-627d-4085-85f1-8f4765fe1c45\") " pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:43.987653 ip-10-0-131-192 
kubenswrapper[2573]: I0417 17:25:43.987036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz74k\" (UniqueName: \"kubernetes.io/projected/98ef03b7-627d-4085-85f1-8f4765fe1c45-kube-api-access-dz74k\") pod \"ingress-canary-zjcw2\" (UID: \"98ef03b7-627d-4085-85f1-8f4765fe1c45\") " pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-crio-socket\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz2zr\" (UniqueName: \"kubernetes.io/projected/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-kube-api-access-rz2zr\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmqnv\" (UniqueName: \"kubernetes.io/projected/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-kube-api-access-dmqnv\") pod \"insights-runtime-extractor-w7lff\" (UID: 
\"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-metrics-tls\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-data-volume\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-tmp-dir\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-crio-socket\") pod 
\"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.987653 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-tmp-dir\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.988068 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.987673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-config-volume\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.988321 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.988195 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-data-volume\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.988458 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.988415 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.991087 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.991065 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-metrics-tls\") pod 
\"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.991087 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.991077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.991672 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.991657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/98ef03b7-627d-4085-85f1-8f4765fe1c45-cert\") pod \"ingress-canary-zjcw2\" (UID: \"98ef03b7-627d-4085-85f1-8f4765fe1c45\") " pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:43.995228 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.995203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz2zr\" (UniqueName: \"kubernetes.io/projected/69aceb1b-03e3-432d-9fbc-1fc05822fa8a-kube-api-access-rz2zr\") pod \"dns-default-zz2w5\" (UID: \"69aceb1b-03e3-432d-9fbc-1fc05822fa8a\") " pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:43.995516 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.995495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmqnv\" (UniqueName: \"kubernetes.io/projected/3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b-kube-api-access-dmqnv\") pod \"insights-runtime-extractor-w7lff\" (UID: \"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b\") " pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:43.995728 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:43.995709 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz74k\" (UniqueName: 
\"kubernetes.io/projected/98ef03b7-627d-4085-85f1-8f4765fe1c45-kube-api-access-dz74k\") pod \"ingress-canary-zjcw2\" (UID: \"98ef03b7-627d-4085-85f1-8f4765fe1c45\") " pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:44.070454 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.070342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zjcw2" Apr 17 17:25:44.077236 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.077204 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-w7lff" Apr 17 17:25:44.093141 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.093112 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:44.116654 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.116619 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" event={"ID":"5b573982-e564-43dc-809a-f117e117fa31","Type":"ContainerStarted","Data":"0eebec8a56f2262318d4f90204e67b6106d6e88491d66f9df36d8d79d7df431c"} Apr 17 17:25:44.218498 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.218446 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8zj4h" podStartSLOduration=5.059335329 podStartE2EDuration="39.218411654s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:25:07.64700925 +0000 UTC m=+3.315017797" lastFinishedPulling="2026-04-17 17:25:41.806085573 +0000 UTC m=+37.474094122" observedRunningTime="2026-04-17 17:25:44.142055625 +0000 UTC m=+39.810064240" watchObservedRunningTime="2026-04-17 17:25:44.218411654 +0000 UTC m=+39.886420221" Apr 17 17:25:44.218782 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.218758 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-w7lff"] Apr 17 17:25:44.222965 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:44.222924 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cfa574d_b7a8_4bbc_9945_ef65d3a62e0b.slice/crio-f8a337ab175d943dc7f8fb18849bbab7d945cfc442deffbf1da74a70fab876ea WatchSource:0}: Error finding container f8a337ab175d943dc7f8fb18849bbab7d945cfc442deffbf1da74a70fab876ea: Status 404 returned error can't find the container with id f8a337ab175d943dc7f8fb18849bbab7d945cfc442deffbf1da74a70fab876ea Apr 17 17:25:44.232050 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.232011 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zjcw2"] Apr 17 17:25:44.235712 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:44.235685 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98ef03b7_627d_4085_85f1_8f4765fe1c45.slice/crio-8b6769eb31263928101ba9946b28de216d8ad2529a5cda05cbcae23f51003ef4 WatchSource:0}: Error finding container 8b6769eb31263928101ba9946b28de216d8ad2529a5cda05cbcae23f51003ef4: Status 404 returned error can't find the container with id 8b6769eb31263928101ba9946b28de216d8ad2529a5cda05cbcae23f51003ef4 Apr 17 17:25:44.245880 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:44.245848 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zz2w5"] Apr 17 17:25:44.254657 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:44.254623 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69aceb1b_03e3_432d_9fbc_1fc05822fa8a.slice/crio-59aa9600aa762780bb93390541a46ce3cf2582e15ce8d40abfce34fb6a966c17 WatchSource:0}: Error finding container 59aa9600aa762780bb93390541a46ce3cf2582e15ce8d40abfce34fb6a966c17: Status 404 returned 
error can't find the container with id 59aa9600aa762780bb93390541a46ce3cf2582e15ce8d40abfce34fb6a966c17 Apr 17 17:25:45.120589 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:45.120531 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zjcw2" event={"ID":"98ef03b7-627d-4085-85f1-8f4765fe1c45","Type":"ContainerStarted","Data":"8b6769eb31263928101ba9946b28de216d8ad2529a5cda05cbcae23f51003ef4"} Apr 17 17:25:45.121772 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:45.121740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zz2w5" event={"ID":"69aceb1b-03e3-432d-9fbc-1fc05822fa8a","Type":"ContainerStarted","Data":"59aa9600aa762780bb93390541a46ce3cf2582e15ce8d40abfce34fb6a966c17"} Apr 17 17:25:45.124113 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:45.124075 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w7lff" event={"ID":"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b","Type":"ContainerStarted","Data":"36a09de9469e7e9c62db9de1ff4275ff4e0fb89e30ab8707d421c5b058852506"} Apr 17 17:25:45.124253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:45.124125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w7lff" event={"ID":"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b","Type":"ContainerStarted","Data":"2c5e0cae33f55e9f7436a9f6a7c7793f3f63a5f907e3d7687de53c81e0d32c20"} Apr 17 17:25:45.124253 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:45.124141 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w7lff" event={"ID":"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b","Type":"ContainerStarted","Data":"f8a337ab175d943dc7f8fb18849bbab7d945cfc442deffbf1da74a70fab876ea"} Apr 17 17:25:47.130603 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.130497 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zz2w5" 
event={"ID":"69aceb1b-03e3-432d-9fbc-1fc05822fa8a","Type":"ContainerStarted","Data":"83161f7bcd515952b1cf525e03e2f900b493acf1f57cfce0f6643725a311e8d2"} Apr 17 17:25:47.130603 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.130543 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zz2w5" event={"ID":"69aceb1b-03e3-432d-9fbc-1fc05822fa8a","Type":"ContainerStarted","Data":"7fba9b1f8f74152104a970f47e12e18aa45b15a78000dd75b4bdab9b2bd660ff"} Apr 17 17:25:47.131043 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.130614 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:47.132333 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.132311 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-w7lff" event={"ID":"3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b","Type":"ContainerStarted","Data":"c4b10da0704713f42115b8eafd6fdb8733872ac7a58de3630d9cbfdcc904f659"} Apr 17 17:25:47.133581 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.133549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zjcw2" event={"ID":"98ef03b7-627d-4085-85f1-8f4765fe1c45","Type":"ContainerStarted","Data":"499a6e42ec01cb0f3620e67729f69164e6e393ac35df3c9cdc223c6fddfd5609"} Apr 17 17:25:47.155258 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.155214 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zz2w5" podStartSLOduration=2.262804794 podStartE2EDuration="4.155201319s" podCreationTimestamp="2026-04-17 17:25:43 +0000 UTC" firstStartedPulling="2026-04-17 17:25:44.256973242 +0000 UTC m=+39.924981791" lastFinishedPulling="2026-04-17 17:25:46.149369756 +0000 UTC m=+41.817378316" observedRunningTime="2026-04-17 17:25:47.154823555 +0000 UTC m=+42.822832123" watchObservedRunningTime="2026-04-17 17:25:47.155201319 +0000 UTC m=+42.823209885" Apr 
17 17:25:47.189509 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.189456 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zjcw2" podStartSLOduration=2.276287033 podStartE2EDuration="4.189420935s" podCreationTimestamp="2026-04-17 17:25:43 +0000 UTC" firstStartedPulling="2026-04-17 17:25:44.2381097 +0000 UTC m=+39.906118244" lastFinishedPulling="2026-04-17 17:25:46.151243586 +0000 UTC m=+41.819252146" observedRunningTime="2026-04-17 17:25:47.188848738 +0000 UTC m=+42.856857316" watchObservedRunningTime="2026-04-17 17:25:47.189420935 +0000 UTC m=+42.857429503" Apr 17 17:25:47.228042 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:47.227965 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-w7lff" podStartSLOduration=1.674317627 podStartE2EDuration="4.227948335s" podCreationTimestamp="2026-04-17 17:25:43 +0000 UTC" firstStartedPulling="2026-04-17 17:25:44.313034588 +0000 UTC m=+39.981043145" lastFinishedPulling="2026-04-17 17:25:46.866665308 +0000 UTC m=+42.534673853" observedRunningTime="2026-04-17 17:25:47.227489064 +0000 UTC m=+42.895497632" watchObservedRunningTime="2026-04-17 17:25:47.227948335 +0000 UTC m=+42.895956902" Apr 17 17:25:48.135321 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.135292 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bf5cbf9f4-bxhqx"] Apr 17 17:25:48.137188 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.137162 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5bf5cbf9f4-bxhqx" Apr 17 17:25:48.141385 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.141297 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:25:48.141536 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.141472 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:25:48.141536 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.141484 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:25:48.141536 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.141527 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nfmp7\"" Apr 17 17:25:48.141707 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.141577 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:25:48.143789 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.143772 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:25:48.144340 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.144323 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:25:48.144466 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.144327 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:25:48.147069 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.147047 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bf5cbf9f4-bxhqx"] Apr 17 17:25:48.323094 ip-10-0-131-192 kubenswrapper[2573]: I0417 
17:25:48.323057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-service-ca\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.323258 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.323102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h9zt\" (UniqueName: \"kubernetes.io/projected/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-kube-api-access-2h9zt\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.323258 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.323177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-config\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.323483 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.323460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-oauth-config\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.323556 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.323522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-serving-cert\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.323623 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.323579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-oauth-serving-cert\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.424358 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.424323 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-oauth-config\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.424547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.424365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-serving-cert\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.424547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.424382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-oauth-serving-cert\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.424547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.424419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-service-ca\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.424547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.424460 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h9zt\" (UniqueName: \"kubernetes.io/projected/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-kube-api-access-2h9zt\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.424547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.424492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-config\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.425126 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.425094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-oauth-serving-cert\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.425230 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.425166 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-service-ca\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.425230 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.425173 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-config\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.426813 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.426784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-serving-cert\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.426912 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.426868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-oauth-config\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.435558 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.435534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h9zt\" (UniqueName: \"kubernetes.io/projected/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-kube-api-access-2h9zt\") pod \"console-5bf5cbf9f4-bxhqx\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") " pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.448346 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.448322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:25:48.530196 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.530170 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"]
Apr 17 17:25:48.532817 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.532800 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:48.535714 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.535690 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-6mclg\""
Apr 17 17:25:48.535822 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.535701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 17 17:25:48.542457 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.542413 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"]
Apr 17 17:25:48.570479 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.570393 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bf5cbf9f4-bxhqx"]
Apr 17 17:25:48.574919 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:48.574891 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a11a4a_ef4f_4e56_bd4d_c04199e78d20.slice/crio-760e918cc95b110018512dde27b39cf8ceabcfae3d8094fe3d6a1533f77915ca WatchSource:0}: Error finding container 760e918cc95b110018512dde27b39cf8ceabcfae3d8094fe3d6a1533f77915ca: Status 404 returned error can't find the container with id 760e918cc95b110018512dde27b39cf8ceabcfae3d8094fe3d6a1533f77915ca
Apr 17 17:25:48.626092 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.626061 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/256258e8-cdda-4bc4-9dd8-63bc774b6915-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c6jkp\" (UID: \"256258e8-cdda-4bc4-9dd8-63bc774b6915\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:48.726633 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:48.726545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/256258e8-cdda-4bc4-9dd8-63bc774b6915-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c6jkp\" (UID: \"256258e8-cdda-4bc4-9dd8-63bc774b6915\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:48.726772 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:48.726704 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 17:25:48.726807 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:48.726784 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/256258e8-cdda-4bc4-9dd8-63bc774b6915-tls-certificates podName:256258e8-cdda-4bc4-9dd8-63bc774b6915 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:49.226768232 +0000 UTC m=+44.894776776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/256258e8-cdda-4bc4-9dd8-63bc774b6915-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-c6jkp" (UID: "256258e8-cdda-4bc4-9dd8-63bc774b6915") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 17 17:25:49.140418 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:49.140373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf5cbf9f4-bxhqx" event={"ID":"49a11a4a-ef4f-4e56-bd4d-c04199e78d20","Type":"ContainerStarted","Data":"760e918cc95b110018512dde27b39cf8ceabcfae3d8094fe3d6a1533f77915ca"}
Apr 17 17:25:49.229964 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:49.229930 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/256258e8-cdda-4bc4-9dd8-63bc774b6915-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c6jkp\" (UID: \"256258e8-cdda-4bc4-9dd8-63bc774b6915\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:49.232754 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:49.232723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/256258e8-cdda-4bc4-9dd8-63bc774b6915-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-c6jkp\" (UID: \"256258e8-cdda-4bc4-9dd8-63bc774b6915\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:49.442713 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:49.442637 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:49.578541 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:49.578510 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"]
Apr 17 17:25:49.583755 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:49.583715 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256258e8_cdda_4bc4_9dd8_63bc774b6915.slice/crio-e75442df53b1a413e564d0b95c3c6cda7e52cff80befcf4481ab2aa9c83abcad WatchSource:0}: Error finding container e75442df53b1a413e564d0b95c3c6cda7e52cff80befcf4481ab2aa9c83abcad: Status 404 returned error can't find the container with id e75442df53b1a413e564d0b95c3c6cda7e52cff80befcf4481ab2aa9c83abcad
Apr 17 17:25:50.145153 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:50.145111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp" event={"ID":"256258e8-cdda-4bc4-9dd8-63bc774b6915","Type":"ContainerStarted","Data":"e75442df53b1a413e564d0b95c3c6cda7e52cff80befcf4481ab2aa9c83abcad"}
Apr 17 17:25:52.151842 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.151800 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf5cbf9f4-bxhqx" event={"ID":"49a11a4a-ef4f-4e56-bd4d-c04199e78d20","Type":"ContainerStarted","Data":"1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063"}
Apr 17 17:25:52.153317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.153288 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp" event={"ID":"256258e8-cdda-4bc4-9dd8-63bc774b6915","Type":"ContainerStarted","Data":"78a4e3de29cb37020073f24ae0a028263340188f07dccfe59c96005625cca2cf"}
Apr 17 17:25:52.153503 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.153485 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:52.158175 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.158153 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp"
Apr 17 17:25:52.169441 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.169382 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bf5cbf9f4-bxhqx" podStartSLOduration=1.457191238 podStartE2EDuration="4.169368676s" podCreationTimestamp="2026-04-17 17:25:48 +0000 UTC" firstStartedPulling="2026-04-17 17:25:48.576791839 +0000 UTC m=+44.244800384" lastFinishedPulling="2026-04-17 17:25:51.288969273 +0000 UTC m=+46.956977822" observedRunningTime="2026-04-17 17:25:52.168491815 +0000 UTC m=+47.836500380" watchObservedRunningTime="2026-04-17 17:25:52.169368676 +0000 UTC m=+47.837377245"
Apr 17 17:25:52.183844 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.183791 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-c6jkp" podStartSLOduration=2.482970842 podStartE2EDuration="4.183773424s" podCreationTimestamp="2026-04-17 17:25:48 +0000 UTC" firstStartedPulling="2026-04-17 17:25:49.586066281 +0000 UTC m=+45.254074832" lastFinishedPulling="2026-04-17 17:25:51.286868869 +0000 UTC m=+46.954877414" observedRunningTime="2026-04-17 17:25:52.183141178 +0000 UTC m=+47.851149745" watchObservedRunningTime="2026-04-17 17:25:52.183773424 +0000 UTC m=+47.851781991"
Apr 17 17:25:52.597364 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.597281 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"]
Apr 17 17:25:52.609351 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.609321 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.610873 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.610845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"]
Apr 17 17:25:52.611786 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.611766 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 17:25:52.613074 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.613049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:25:52.613074 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.613069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:25:52.613245 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.613091 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-dc8pv\""
Apr 17 17:25:52.613245 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.613107 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 17:25:52.613245 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.613092 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 17:25:52.656107 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.656068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f401d650-f1a7-427c-9b26-1fbb623c4372-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.656241 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.656119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f401d650-f1a7-427c-9b26-1fbb623c4372-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.656241 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.656139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84pn\" (UniqueName: \"kubernetes.io/projected/f401d650-f1a7-427c-9b26-1fbb623c4372-kube-api-access-d84pn\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.656241 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.656208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f401d650-f1a7-427c-9b26-1fbb623c4372-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.757176 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.757137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d84pn\" (UniqueName: \"kubernetes.io/projected/f401d650-f1a7-427c-9b26-1fbb623c4372-kube-api-access-d84pn\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.757321 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.757184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f401d650-f1a7-427c-9b26-1fbb623c4372-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.757321 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.757247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f401d650-f1a7-427c-9b26-1fbb623c4372-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.757321 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.757314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f401d650-f1a7-427c-9b26-1fbb623c4372-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.758021 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.757997 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f401d650-f1a7-427c-9b26-1fbb623c4372-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.759614 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.759590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/f401d650-f1a7-427c-9b26-1fbb623c4372-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.759614 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.759603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f401d650-f1a7-427c-9b26-1fbb623c4372-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.766373 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.766348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d84pn\" (UniqueName: \"kubernetes.io/projected/f401d650-f1a7-427c-9b26-1fbb623c4372-kube-api-access-d84pn\") pod \"prometheus-operator-5676c8c784-xt4mn\" (UID: \"f401d650-f1a7-427c-9b26-1fbb623c4372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:52.918188 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:52.918152 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"
Apr 17 17:25:53.042329 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:53.042292 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-xt4mn"]
Apr 17 17:25:53.044876 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:53.044845 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf401d650_f1a7_427c_9b26_1fbb623c4372.slice/crio-1c7cba3151d72c40d49494060b97f86bd9eb4fda4dfe197043aade09eb12f09f WatchSource:0}: Error finding container 1c7cba3151d72c40d49494060b97f86bd9eb4fda4dfe197043aade09eb12f09f: Status 404 returned error can't find the container with id 1c7cba3151d72c40d49494060b97f86bd9eb4fda4dfe197043aade09eb12f09f
Apr 17 17:25:53.156496 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:53.156448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn" event={"ID":"f401d650-f1a7-427c-9b26-1fbb623c4372","Type":"ContainerStarted","Data":"1c7cba3151d72c40d49494060b97f86bd9eb4fda4dfe197043aade09eb12f09f"}
Apr 17 17:25:55.163441 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.163397 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn" event={"ID":"f401d650-f1a7-427c-9b26-1fbb623c4372","Type":"ContainerStarted","Data":"27946e5527138e60bc854dd4ab0a834afd38c4fdb4b0e7406be0fffced426405"}
Apr 17 17:25:55.163824 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.163453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn" event={"ID":"f401d650-f1a7-427c-9b26-1fbb623c4372","Type":"ContainerStarted","Data":"e4810a53e90a7c4a64562d7ad50bc04b86c0f97f4c1c406b75af78f882b022b2"}
Apr 17 17:25:55.183102 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.183056 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-xt4mn" podStartSLOduration=1.546126035 podStartE2EDuration="3.183041986s" podCreationTimestamp="2026-04-17 17:25:52 +0000 UTC" firstStartedPulling="2026-04-17 17:25:53.046773642 +0000 UTC m=+48.714782187" lastFinishedPulling="2026-04-17 17:25:54.683689592 +0000 UTC m=+50.351698138" observedRunningTime="2026-04-17 17:25:55.182735621 +0000 UTC m=+50.850744199" watchObservedRunningTime="2026-04-17 17:25:55.183041986 +0000 UTC m=+50.851050553"
Apr 17 17:25:55.706295 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.706265 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-775565c6cf-d4kzv"]
Apr 17 17:25:55.729496 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.729470 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-775565c6cf-d4kzv"]
Apr 17 17:25:55.729635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.729579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.737595 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.737574 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 17:25:55.778613 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-oauth-serving-cert\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.778613 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq2v8\" (UniqueName: \"kubernetes.io/projected/9b1aa968-7a05-4027-a8d7-648be60980c0-kube-api-access-xq2v8\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.778853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778635 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-console-config\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.778853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-service-ca\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.778853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778792 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-serving-cert\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.778853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-trusted-ca-bundle\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.779000 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.778904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-oauth-config\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.879559 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-oauth-config\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.879743 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-oauth-serving-cert\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.879743 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq2v8\" (UniqueName: \"kubernetes.io/projected/9b1aa968-7a05-4027-a8d7-648be60980c0-kube-api-access-xq2v8\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.879743 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-console-config\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.879743 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-service-ca\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.879743 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879704 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-serving-cert\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.880007 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.879830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-trusted-ca-bundle\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.880419 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.880397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-console-config\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.880545 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.880453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-oauth-serving-cert\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.880609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.880556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-service-ca\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.882092 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.882070 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-oauth-config\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.882211 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.882174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-serving-cert\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.888690 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.888668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq2v8\" (UniqueName: \"kubernetes.io/projected/9b1aa968-7a05-4027-a8d7-648be60980c0-kube-api-access-xq2v8\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:55.894873 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:55.894849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-trusted-ca-bundle\") pod \"console-775565c6cf-d4kzv\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:56.039301 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:56.039215 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:25:56.157730 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:56.157699 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-775565c6cf-d4kzv"]
Apr 17 17:25:56.160569 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:56.160542 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1aa968_7a05_4027_a8d7_648be60980c0.slice/crio-594e405c27343c364a33f9e4072ecfed976eb6deaa7fd69f7e6853f8b57bf34a WatchSource:0}: Error finding container 594e405c27343c364a33f9e4072ecfed976eb6deaa7fd69f7e6853f8b57bf34a: Status 404 returned error can't find the container with id 594e405c27343c364a33f9e4072ecfed976eb6deaa7fd69f7e6853f8b57bf34a
Apr 17 17:25:56.166248 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:56.166220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-775565c6cf-d4kzv" event={"ID":"9b1aa968-7a05-4027-a8d7-648be60980c0","Type":"ContainerStarted","Data":"594e405c27343c364a33f9e4072ecfed976eb6deaa7fd69f7e6853f8b57bf34a"}
Apr 17 17:25:56.997861 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:56.997824 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-d69mc"]
Apr 17 17:25:57.021911 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.021882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d69mc"
Apr 17 17:25:57.024692 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.024668 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:25:57.024857 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.024708 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:25:57.024857 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.024673 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-29gws\""
Apr 17 17:25:57.024857 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.024760 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:25:57.089488 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-sys\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc"
Apr 17 17:25:57.089662 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-textfile\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc"
Apr 17 17:25:57.089662 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089596 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.089662 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-accelerators-collector-config\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.089776 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fxvt\" (UniqueName: \"kubernetes.io/projected/686baeef-d269-4500-9e3f-aed10f7f1c7d-kube-api-access-2fxvt\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.089776 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-root\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.089776 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/686baeef-d269-4500-9e3f-aed10f7f1c7d-metrics-client-ca\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 
17:25:57.089776 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089758 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-wtmp\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.089924 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.089780 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.139476 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.139446 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zz2w5" Apr 17 17:25:57.171108 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.171073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-775565c6cf-d4kzv" event={"ID":"9b1aa968-7a05-4027-a8d7-648be60980c0","Type":"ContainerStarted","Data":"36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f"} Apr 17 17:25:57.191034 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.190997 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-textfile\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.191318 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.191300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.191550 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.191512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-accelerators-collector-config\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.191713 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.191698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fxvt\" (UniqueName: \"kubernetes.io/projected/686baeef-d269-4500-9e3f-aed10f7f1c7d-kube-api-access-2fxvt\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.191843 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.191830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-root\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.192007 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.191985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/686baeef-d269-4500-9e3f-aed10f7f1c7d-metrics-client-ca\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.192231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.192215 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-wtmp\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.192330 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.192317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.192481 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.192469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-sys\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.193651 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.193629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-textfile\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.193878 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:57.193863 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:57.194024 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:57.194015 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls 
podName:686baeef-d269-4500-9e3f-aed10f7f1c7d nodeName:}" failed. No retries permitted until 2026-04-17 17:25:57.693994588 +0000 UTC m=+53.362003146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls") pod "node-exporter-d69mc" (UID: "686baeef-d269-4500-9e3f-aed10f7f1c7d") : secret "node-exporter-tls" not found Apr 17 17:25:57.195176 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.195146 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-accelerators-collector-config\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.195613 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.195593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-root\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.196329 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.196310 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/686baeef-d269-4500-9e3f-aed10f7f1c7d-metrics-client-ca\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.197095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.197050 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-wtmp\") pod \"node-exporter-d69mc\" (UID: 
\"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.197540 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.197518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/686baeef-d269-4500-9e3f-aed10f7f1c7d-sys\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.202929 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.202908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.208511 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.208484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fxvt\" (UniqueName: \"kubernetes.io/projected/686baeef-d269-4500-9e3f-aed10f7f1c7d-kube-api-access-2fxvt\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.695900 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:57.695863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:57.696087 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:57.696016 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:57.696131 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:25:57.696104 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls podName:686baeef-d269-4500-9e3f-aed10f7f1c7d nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.696085842 +0000 UTC m=+54.364094399 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls") pod "node-exporter-d69mc" (UID: "686baeef-d269-4500-9e3f-aed10f7f1c7d") : secret "node-exporter-tls" not found Apr 17 17:25:58.448707 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.448665 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5bf5cbf9f4-bxhqx" Apr 17 17:25:58.448707 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.448716 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bf5cbf9f4-bxhqx" Apr 17 17:25:58.453341 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.453320 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bf5cbf9f4-bxhqx" Apr 17 17:25:58.473099 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.473058 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-775565c6cf-d4kzv" podStartSLOduration=3.473045343 podStartE2EDuration="3.473045343s" podCreationTimestamp="2026-04-17 17:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:57.197462983 +0000 UTC m=+52.865471551" watchObservedRunningTime="2026-04-17 17:25:58.473045343 +0000 UTC m=+54.141053909" Apr 17 17:25:58.704318 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.704231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:58.706496 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.706473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/686baeef-d269-4500-9e3f-aed10f7f1c7d-node-exporter-tls\") pod \"node-exporter-d69mc\" (UID: \"686baeef-d269-4500-9e3f-aed10f7f1c7d\") " pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:58.831058 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:58.831023 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-d69mc" Apr 17 17:25:58.840167 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:58.840143 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686baeef_d269_4500_9e3f_aed10f7f1c7d.slice/crio-38c686afab8b2e0c66164b59a2187ba9b7a54ca7456348f8e27377f0309d9ffc WatchSource:0}: Error finding container 38c686afab8b2e0c66164b59a2187ba9b7a54ca7456348f8e27377f0309d9ffc: Status 404 returned error can't find the container with id 38c686afab8b2e0c66164b59a2187ba9b7a54ca7456348f8e27377f0309d9ffc Apr 17 17:25:59.005206 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.005115 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-669776895b-schm7"] Apr 17 17:25:59.019748 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.019719 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.022500 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022475 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-669776895b-schm7"] Apr 17 17:25:59.022635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022550 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-kccrq\"" Apr 17 17:25:59.022820 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022782 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 17:25:59.022937 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022837 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 17:25:59.022937 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022846 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 17:25:59.022937 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022836 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-at207bb671u5p\"" Apr 17 17:25:59.022937 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.022838 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 17:25:59.023484 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.023470 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 17:25:59.107948 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.107912 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-grpc-tls\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108117 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.107964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108117 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.108017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a0915b4-8767-4493-ac81-7885fb3dd23a-metrics-client-ca\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108117 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.108060 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52bbz\" (UniqueName: \"kubernetes.io/projected/9a0915b4-8767-4493-ac81-7885fb3dd23a-kube-api-access-52bbz\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108117 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.108090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108261 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.108118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108261 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.108162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-tls\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.108261 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.108191 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.177011 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.176974 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d69mc" 
event={"ID":"686baeef-d269-4500-9e3f-aed10f7f1c7d","Type":"ContainerStarted","Data":"38c686afab8b2e0c66164b59a2187ba9b7a54ca7456348f8e27377f0309d9ffc"} Apr 17 17:25:59.181177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.181157 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bf5cbf9f4-bxhqx" Apr 17 17:25:59.209102 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209102 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a0915b4-8767-4493-ac81-7885fb3dd23a-metrics-client-ca\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209144 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52bbz\" (UniqueName: \"kubernetes.io/projected/9a0915b4-8767-4493-ac81-7885fb3dd23a-kube-api-access-52bbz\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-tls\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209259 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.209606 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.209500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-grpc-tls\") pod \"thanos-querier-669776895b-schm7\" (UID: 
\"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.210043 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.210011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a0915b4-8767-4493-ac81-7885fb3dd23a-metrics-client-ca\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.211974 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.211947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.212148 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.212126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-tls\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.212232 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.212197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.212232 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.212222 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.212308 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.212281 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.212534 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.212515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/9a0915b4-8767-4493-ac81-7885fb3dd23a-secret-grpc-tls\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.222359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.222330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52bbz\" (UniqueName: \"kubernetes.io/projected/9a0915b4-8767-4493-ac81-7885fb3dd23a-kube-api-access-52bbz\") pod \"thanos-querier-669776895b-schm7\" (UID: \"9a0915b4-8767-4493-ac81-7885fb3dd23a\") " pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.330534 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.330457 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:25:59.470627 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:25:59.470595 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-669776895b-schm7"] Apr 17 17:25:59.473634 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:25:59.473607 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0915b4_8767_4493_ac81_7885fb3dd23a.slice/crio-33f4f122787f1b26c14f179d5ffacd22f541a9b4ce14ea85b28292da4a6e6c28 WatchSource:0}: Error finding container 33f4f122787f1b26c14f179d5ffacd22f541a9b4ce14ea85b28292da4a6e6c28: Status 404 returned error can't find the container with id 33f4f122787f1b26c14f179d5ffacd22f541a9b4ce14ea85b28292da4a6e6c28 Apr 17 17:26:00.180853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:00.180822 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"33f4f122787f1b26c14f179d5ffacd22f541a9b4ce14ea85b28292da4a6e6c28"} Apr 17 17:26:01.186997 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.186960 2573 generic.go:358] "Generic (PLEG): container finished" podID="686baeef-d269-4500-9e3f-aed10f7f1c7d" containerID="268b4d35f55d93a2681e19ee1ec9632ef5d90b080eed800f244debafd8fdd90a" exitCode=0 Apr 17 17:26:01.187483 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.187023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d69mc" event={"ID":"686baeef-d269-4500-9e3f-aed10f7f1c7d","Type":"ContainerDied","Data":"268b4d35f55d93a2681e19ee1ec9632ef5d90b080eed800f244debafd8fdd90a"} Apr 17 17:26:01.424249 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.424208 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8"] Apr 17 
17:26:01.454745 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.454676 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8"] Apr 17 17:26:01.454901 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.454811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.458044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.457863 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-npnkc\"" Apr 17 17:26:01.458044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.457879 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-91e3vc2frbckn\"" Apr 17 17:26:01.458044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.457897 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 17:26:01.458044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.457915 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 17:26:01.458044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.457863 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 17:26:01.458044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.457873 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:26:01.526152 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/90933178-05d1-4eff-9f6c-a4b256cf0f68-audit-log\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.526317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526182 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-client-ca-bundle\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.526317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-secret-metrics-server-tls\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.526317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90933178-05d1-4eff-9f6c-a4b256cf0f68-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.526317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526284 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-secret-metrics-server-client-certs\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: 
\"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.526317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/90933178-05d1-4eff-9f6c-a4b256cf0f68-metrics-server-audit-profiles\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.526522 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.526338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvlx6\" (UniqueName: \"kubernetes.io/projected/90933178-05d1-4eff-9f6c-a4b256cf0f68-kube-api-access-bvlx6\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627358 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-client-ca-bundle\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627518 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-secret-metrics-server-tls\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627518 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627409 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90933178-05d1-4eff-9f6c-a4b256cf0f68-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627518 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-secret-metrics-server-client-certs\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627518 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/90933178-05d1-4eff-9f6c-a4b256cf0f68-metrics-server-audit-profiles\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627733 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvlx6\" (UniqueName: \"kubernetes.io/projected/90933178-05d1-4eff-9f6c-a4b256cf0f68-kube-api-access-bvlx6\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627733 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/90933178-05d1-4eff-9f6c-a4b256cf0f68-audit-log\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.627971 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.627952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/90933178-05d1-4eff-9f6c-a4b256cf0f68-audit-log\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.628255 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.628236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90933178-05d1-4eff-9f6c-a4b256cf0f68-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.628671 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.628644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/90933178-05d1-4eff-9f6c-a4b256cf0f68-metrics-server-audit-profiles\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.629664 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.629640 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-client-ca-bundle\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.629955 ip-10-0-131-192 
kubenswrapper[2573]: I0417 17:26:01.629933 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-secret-metrics-server-client-certs\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.630029 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.629941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/90933178-05d1-4eff-9f6c-a4b256cf0f68-secret-metrics-server-tls\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.635867 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.635847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvlx6\" (UniqueName: \"kubernetes.io/projected/90933178-05d1-4eff-9f6c-a4b256cf0f68-kube-api-access-bvlx6\") pod \"metrics-server-6f8dbf5fdc-sfvf8\" (UID: \"90933178-05d1-4eff-9f6c-a4b256cf0f68\") " pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.762851 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.762812 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx"] Apr 17 17:26:01.764624 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.764587 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:01.772177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.772149 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:01.774333 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.774309 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 17:26:01.774495 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.774381 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dj7pf\"" Apr 17 17:26:01.774992 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.774966 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx"] Apr 17 17:26:01.829246 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.829217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/39b7ffe1-d40b-4c40-8bfa-72243d49873b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4d8sx\" (UID: \"39b7ffe1-d40b-4c40-8bfa-72243d49873b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:01.929813 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.929782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/39b7ffe1-d40b-4c40-8bfa-72243d49873b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4d8sx\" (UID: \"39b7ffe1-d40b-4c40-8bfa-72243d49873b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:01.930298 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.930248 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8"] Apr 17 17:26:01.932910 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:01.932891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/39b7ffe1-d40b-4c40-8bfa-72243d49873b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4d8sx\" (UID: \"39b7ffe1-d40b-4c40-8bfa-72243d49873b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:01.934930 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:26:01.934907 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90933178_05d1_4eff_9f6c_a4b256cf0f68.slice/crio-9f880e8274b9b68a8311a2805fb3cd00305f149edd6e37e9988a65f6a70971fc WatchSource:0}: Error finding container 9f880e8274b9b68a8311a2805fb3cd00305f149edd6e37e9988a65f6a70971fc: Status 404 returned error can't find the container with id 9f880e8274b9b68a8311a2805fb3cd00305f149edd6e37e9988a65f6a70971fc Apr 17 17:26:02.081921 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.081896 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:02.192687 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.192655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"72c9bbfe04c65d52b2c6d2e871158fb6f9c63a63a89ef2b790ac63ae6a5a1380"} Apr 17 17:26:02.192687 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.192692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"0d662bd2339f53cb82f508747707a539af4c759ed532056a1066585fc3447364"} Apr 17 17:26:02.194710 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.194684 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d69mc" 
event={"ID":"686baeef-d269-4500-9e3f-aed10f7f1c7d","Type":"ContainerStarted","Data":"45988cef90ad4c111bec84ed9025073507b0281952c69ba0898162af5fbe4eb1"} Apr 17 17:26:02.194834 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.194716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-d69mc" event={"ID":"686baeef-d269-4500-9e3f-aed10f7f1c7d","Type":"ContainerStarted","Data":"fab62ce91b98c75f05247756fb95906e6076b07d69e8777bb73f73721551bd3e"} Apr 17 17:26:02.195835 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.195813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" event={"ID":"90933178-05d1-4eff-9f6c-a4b256cf0f68","Type":"ContainerStarted","Data":"9f880e8274b9b68a8311a2805fb3cd00305f149edd6e37e9988a65f6a70971fc"} Apr 17 17:26:02.211306 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.211282 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx"] Apr 17 17:26:02.214257 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:02.214211 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-d69mc" podStartSLOduration=4.8835633000000005 podStartE2EDuration="6.214193727s" podCreationTimestamp="2026-04-17 17:25:56 +0000 UTC" firstStartedPulling="2026-04-17 17:25:58.842262231 +0000 UTC m=+54.510270779" lastFinishedPulling="2026-04-17 17:26:00.172892655 +0000 UTC m=+55.840901206" observedRunningTime="2026-04-17 17:26:02.213708031 +0000 UTC m=+57.881716621" watchObservedRunningTime="2026-04-17 17:26:02.214193727 +0000 UTC m=+57.882202335" Apr 17 17:26:02.214727 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:26:02.214705 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b7ffe1_d40b_4c40_8bfa_72243d49873b.slice/crio-e3e1966208a7fe64bd51413ed46fa3b4e535e5ec4075c9d5e0e6282afd31ded0 WatchSource:0}: Error finding container e3e1966208a7fe64bd51413ed46fa3b4e535e5ec4075c9d5e0e6282afd31ded0: Status 404 returned error can't find the container with id e3e1966208a7fe64bd51413ed46fa3b4e535e5ec4075c9d5e0e6282afd31ded0 Apr 17 17:26:03.204520 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:03.204462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"bc712618d8c49bafe936808f72ad143b52ed731b3d0433e4ed58232cac438d30"} Apr 17 17:26:03.206108 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:03.206083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" event={"ID":"39b7ffe1-d40b-4c40-8bfa-72243d49873b","Type":"ContainerStarted","Data":"e3e1966208a7fe64bd51413ed46fa3b4e535e5ec4075c9d5e0e6282afd31ded0"} Apr 17 17:26:04.100863 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.100826 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hr974" Apr 17 17:26:04.553438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.553388 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-775565c6cf-d4kzv"] Apr 17 17:26:04.587138 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.587102 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6bf9689f8d-g8d5w"] Apr 17 17:26:04.609220 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.609192 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bf9689f8d-g8d5w"] Apr 17 17:26:04.609385 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.609325 2573 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656162 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656123 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-serving-cert\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656162 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-service-ca\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656345 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8cf9\" (UniqueName: \"kubernetes.io/projected/5faba003-cf23-4cd7-9b0c-f378f21505de-kube-api-access-k8cf9\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656345 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-console-config\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656345 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-oauth-config\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656463 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656357 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-oauth-serving-cert\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.656463 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.656388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-trusted-ca-bundle\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757109 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.757068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-trusted-ca-bundle\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757194 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.757152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-serving-cert\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757194 ip-10-0-131-192 
kubenswrapper[2573]: I0417 17:26:04.757187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-service-ca\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757302 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.757220 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8cf9\" (UniqueName: \"kubernetes.io/projected/5faba003-cf23-4cd7-9b0c-f378f21505de-kube-api-access-k8cf9\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757355 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.757297 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-console-config\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757355 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.757334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-oauth-config\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.757465 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.757400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-oauth-serving-cert\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " 
pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.758650 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.758163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-oauth-serving-cert\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.758650 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.758606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-trusted-ca-bundle\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.758786 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.758675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-console-config\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.759441 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.759401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-service-ca\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.760164 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.760139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-serving-cert\") pod \"console-6bf9689f8d-g8d5w\" (UID: 
\"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.760340 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.760313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-oauth-config\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.766699 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.766674 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8cf9\" (UniqueName: \"kubernetes.io/projected/5faba003-cf23-4cd7-9b0c-f378f21505de-kube-api-access-k8cf9\") pod \"console-6bf9689f8d-g8d5w\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") " pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:04.918459 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:04.918415 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:05.053014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.052987 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6bf9689f8d-g8d5w"] Apr 17 17:26:05.056307 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:26:05.056271 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5faba003_cf23_4cd7_9b0c_f378f21505de.slice/crio-dcd64d0308e629326eaa48a67cf3f7940ef28bf327bf433a0a42730ce8def0b7 WatchSource:0}: Error finding container dcd64d0308e629326eaa48a67cf3f7940ef28bf327bf433a0a42730ce8def0b7: Status 404 returned error can't find the container with id dcd64d0308e629326eaa48a67cf3f7940ef28bf327bf433a0a42730ce8def0b7 Apr 17 17:26:05.213558 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.213520 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" event={"ID":"90933178-05d1-4eff-9f6c-a4b256cf0f68","Type":"ContainerStarted","Data":"320361ebf0d0610d8484c67e5fd98341647c25c5e21d0a68bc10af3704d89374"} Apr 17 17:26:05.215100 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.215069 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" event={"ID":"39b7ffe1-d40b-4c40-8bfa-72243d49873b","Type":"ContainerStarted","Data":"afbf3f5cc78ece2c1f130eba37891270a91535010f99c826d2fa3a342794f0fb"} Apr 17 17:26:05.215268 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.215251 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:05.216742 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.216717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf9689f8d-g8d5w" 
event={"ID":"5faba003-cf23-4cd7-9b0c-f378f21505de","Type":"ContainerStarted","Data":"cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f"} Apr 17 17:26:05.216848 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.216746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf9689f8d-g8d5w" event={"ID":"5faba003-cf23-4cd7-9b0c-f378f21505de","Type":"ContainerStarted","Data":"dcd64d0308e629326eaa48a67cf3f7940ef28bf327bf433a0a42730ce8def0b7"} Apr 17 17:26:05.219899 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.219872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"c347e81135467fbc97b77c80820fa1711e9c700aee32e04bdd45973fe2c8e51b"} Apr 17 17:26:05.220014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.219905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"a62bd1c5d0cb81175581d8be8a819b0a0a79d5312ccd89dd578fdebffd7f59d7"} Apr 17 17:26:05.220014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.219917 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-669776895b-schm7" event={"ID":"9a0915b4-8767-4493-ac81-7885fb3dd23a","Type":"ContainerStarted","Data":"1460409661b1af98798180bc201457c9267b43caeedac268e5e3e2b802c38942"} Apr 17 17:26:05.220131 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.220025 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:26:05.220701 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.220686 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" Apr 17 17:26:05.235190 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.235087 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" podStartSLOduration=1.483512561 podStartE2EDuration="4.235075023s" podCreationTimestamp="2026-04-17 17:26:01 +0000 UTC" firstStartedPulling="2026-04-17 17:26:01.937973512 +0000 UTC m=+57.605982060" lastFinishedPulling="2026-04-17 17:26:04.689535977 +0000 UTC m=+60.357544522" observedRunningTime="2026-04-17 17:26:05.23381119 +0000 UTC m=+60.901819755" watchObservedRunningTime="2026-04-17 17:26:05.235075023 +0000 UTC m=+60.903083590" Apr 17 17:26:05.259445 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.259374 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-669776895b-schm7" podStartSLOduration=2.044785085 podStartE2EDuration="7.259357809s" podCreationTimestamp="2026-04-17 17:25:58 +0000 UTC" firstStartedPulling="2026-04-17 17:25:59.475511198 +0000 UTC m=+55.143519743" lastFinishedPulling="2026-04-17 17:26:04.690083919 +0000 UTC m=+60.358092467" observedRunningTime="2026-04-17 17:26:05.258454512 +0000 UTC m=+60.926463079" watchObservedRunningTime="2026-04-17 17:26:05.259357809 +0000 UTC m=+60.927366375" Apr 17 17:26:05.276244 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.276198 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6bf9689f8d-g8d5w" podStartSLOduration=1.2761847579999999 podStartE2EDuration="1.276184758s" podCreationTimestamp="2026-04-17 17:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:05.275527056 +0000 UTC m=+60.943535623" watchObservedRunningTime="2026-04-17 17:26:05.276184758 +0000 UTC m=+60.944193316" Apr 17 17:26:05.297865 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:05.297806 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4d8sx" podStartSLOduration=1.821065495 podStartE2EDuration="4.297791445s" podCreationTimestamp="2026-04-17 17:26:01 +0000 UTC" firstStartedPulling="2026-04-17 17:26:02.216693393 +0000 UTC m=+57.884701939" lastFinishedPulling="2026-04-17 17:26:04.69341934 +0000 UTC m=+60.361427889" observedRunningTime="2026-04-17 17:26:05.296682102 +0000 UTC m=+60.964690672" watchObservedRunningTime="2026-04-17 17:26:05.297791445 +0000 UTC m=+60.965800012" Apr 17 17:26:06.040197 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.040158 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-775565c6cf-d4kzv" Apr 17 17:26:06.721963 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.721927 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bf9689f8d-g8d5w"] Apr 17 17:26:06.759897 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.759867 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f459bb64b-klx5x"] Apr 17 17:26:06.790026 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.789995 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f459bb64b-klx5x"] Apr 17 17:26:06.790168 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.790102 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877531 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-config\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877698 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877551 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-oauth-serving-cert\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877698 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-245qq\" (UniqueName: \"kubernetes.io/projected/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-kube-api-access-245qq\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877698 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-serving-cert\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877802 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-service-ca\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877802 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-oauth-config\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.877873 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.877812 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-trusted-ca-bundle\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979188 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.979099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-245qq\" (UniqueName: \"kubernetes.io/projected/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-kube-api-access-245qq\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979188 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.979140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-serving-cert\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979188 ip-10-0-131-192 kubenswrapper[2573]: 
I0417 17:26:06.979169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-service-ca\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979504 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.979227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-oauth-config\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979504 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.979249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-trusted-ca-bundle\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979504 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.979295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-config\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.979504 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.979331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-oauth-serving-cert\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 
17:26:06.980044 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.980021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-service-ca\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.980116 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.980027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-config\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.980267 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.980245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-trusted-ca-bundle\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.980384 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.980366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-oauth-serving-cert\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.981635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.981612 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-oauth-config\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" 
Apr 17 17:26:06.981747 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.981729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-serving-cert\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:06.987900 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:06.987880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-245qq\" (UniqueName: \"kubernetes.io/projected/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-kube-api-access-245qq\") pod \"console-f459bb64b-klx5x\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:07.099555 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:07.099515 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:07.219457 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:07.219415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f459bb64b-klx5x"] Apr 17 17:26:07.221917 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:26:07.221889 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb594bc_3f1b_4c8a_897e_70f5038c4d1c.slice/crio-82061684d6f9e1c37eebe45bd7abebf630205155f5c556b3c75d1a7cef7579db WatchSource:0}: Error finding container 82061684d6f9e1c37eebe45bd7abebf630205155f5c556b3c75d1a7cef7579db: Status 404 returned error can't find the container with id 82061684d6f9e1c37eebe45bd7abebf630205155f5c556b3c75d1a7cef7579db Apr 17 17:26:07.227224 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:07.227197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f459bb64b-klx5x" 
event={"ID":"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c","Type":"ContainerStarted","Data":"82061684d6f9e1c37eebe45bd7abebf630205155f5c556b3c75d1a7cef7579db"} Apr 17 17:26:08.233070 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:08.232810 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f459bb64b-klx5x" event={"ID":"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c","Type":"ContainerStarted","Data":"c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d"} Apr 17 17:26:08.254324 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:08.254260 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f459bb64b-klx5x" podStartSLOduration=2.254239126 podStartE2EDuration="2.254239126s" podCreationTimestamp="2026-04-17 17:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:08.252194614 +0000 UTC m=+63.920203183" watchObservedRunningTime="2026-04-17 17:26:08.254239126 +0000 UTC m=+63.922247694" Apr 17 17:26:10.614375 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.614334 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" (UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:26:10.617179 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.617154 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:26:10.627666 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.627642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/54c39df0-963a-429e-b7e9-1cf754453932-metrics-certs\") pod \"network-metrics-daemon-6hw86\" 
(UID: \"54c39df0-963a-429e-b7e9-1cf754453932\") " pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:26:10.632017 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.631997 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vr6gd\"" Apr 17 17:26:10.640075 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.640055 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6hw86" Apr 17 17:26:10.715358 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.715270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:26:10.718026 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.718003 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:26:10.729065 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.729036 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:26:10.739003 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.738979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vkh2\" (UniqueName: \"kubernetes.io/projected/3dbf031f-03a8-4194-a694-20fe7307d30f-kube-api-access-6vkh2\") pod \"network-check-target-56d9d\" (UID: \"3dbf031f-03a8-4194-a694-20fe7307d30f\") " pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:26:10.763575 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.763478 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-6hw86"] Apr 17 17:26:10.766215 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:26:10.766190 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54c39df0_963a_429e_b7e9_1cf754453932.slice/crio-be64633cb0a6b44f084a35a1b2ee52d31877cb33b66a9b658a606b8983fb2a5c WatchSource:0}: Error finding container be64633cb0a6b44f084a35a1b2ee52d31877cb33b66a9b658a606b8983fb2a5c: Status 404 returned error can't find the container with id be64633cb0a6b44f084a35a1b2ee52d31877cb33b66a9b658a606b8983fb2a5c Apr 17 17:26:10.936563 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.936531 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bp9pv\"" Apr 17 17:26:10.944539 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:10.944519 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:26:11.059117 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:11.059085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-56d9d"] Apr 17 17:26:11.063342 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:26:11.063314 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dbf031f_03a8_4194_a694_20fe7307d30f.slice/crio-692a2698bbaaaaec66996865e3cb485add0e7db558fa9dbb1ae4aa9ff5464055 WatchSource:0}: Error finding container 692a2698bbaaaaec66996865e3cb485add0e7db558fa9dbb1ae4aa9ff5464055: Status 404 returned error can't find the container with id 692a2698bbaaaaec66996865e3cb485add0e7db558fa9dbb1ae4aa9ff5464055 Apr 17 17:26:11.231410 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:11.231338 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-669776895b-schm7" Apr 17 17:26:11.243622 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:11.243583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-56d9d" event={"ID":"3dbf031f-03a8-4194-a694-20fe7307d30f","Type":"ContainerStarted","Data":"692a2698bbaaaaec66996865e3cb485add0e7db558fa9dbb1ae4aa9ff5464055"} Apr 17 17:26:11.244802 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:11.244765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6hw86" event={"ID":"54c39df0-963a-429e-b7e9-1cf754453932","Type":"ContainerStarted","Data":"be64633cb0a6b44f084a35a1b2ee52d31877cb33b66a9b658a606b8983fb2a5c"} Apr 17 17:26:12.250289 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:12.250246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6hw86" event={"ID":"54c39df0-963a-429e-b7e9-1cf754453932","Type":"ContainerStarted","Data":"f8f50176666056b445288d0c2bcc7cc3391e5686e07dc3e8acc9eedfa1986fbe"} Apr 17 17:26:12.250289 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:12.250295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6hw86" event={"ID":"54c39df0-963a-429e-b7e9-1cf754453932","Type":"ContainerStarted","Data":"474466d29e47d525430c1f6e0662a266faac29c971e5856701c8d2c3788588fd"} Apr 17 17:26:12.270286 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:12.270225 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6hw86" podStartSLOduration=66.238734829 podStartE2EDuration="1m7.270209787s" podCreationTimestamp="2026-04-17 17:25:05 +0000 UTC" firstStartedPulling="2026-04-17 17:26:10.768459822 +0000 UTC m=+66.436468367" lastFinishedPulling="2026-04-17 17:26:11.799934768 +0000 UTC m=+67.467943325" observedRunningTime="2026-04-17 17:26:12.265798 +0000 UTC m=+67.933806581" 
watchObservedRunningTime="2026-04-17 17:26:12.270209787 +0000 UTC m=+67.938218353" Apr 17 17:26:14.257794 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:14.257707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-56d9d" event={"ID":"3dbf031f-03a8-4194-a694-20fe7307d30f","Type":"ContainerStarted","Data":"735ce817625861c8e2ed3ab93df9ac6d858d9f8d9f612c10eaec4d61fb1ea9b7"} Apr 17 17:26:14.258162 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:14.257938 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-56d9d" Apr 17 17:26:14.276051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:14.275966 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-56d9d" podStartSLOduration=67.390547053 podStartE2EDuration="1m10.275950608s" podCreationTimestamp="2026-04-17 17:25:04 +0000 UTC" firstStartedPulling="2026-04-17 17:26:11.065195287 +0000 UTC m=+66.733203846" lastFinishedPulling="2026-04-17 17:26:13.950598857 +0000 UTC m=+69.618607401" observedRunningTime="2026-04-17 17:26:14.275787966 +0000 UTC m=+69.943796533" watchObservedRunningTime="2026-04-17 17:26:14.275950608 +0000 UTC m=+69.943959176" Apr 17 17:26:14.919049 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:14.919016 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6bf9689f8d-g8d5w" Apr 17 17:26:17.100472 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:17.100413 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:17.100964 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:17.100483 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:17.105208 ip-10-0-131-192 kubenswrapper[2573]: 
I0417 17:26:17.105184 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:17.271809 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:17.271777 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:26:17.329710 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:17.329678 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bf5cbf9f4-bxhqx"] Apr 17 17:26:21.765596 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:21.765549 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:21.765596 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:21.765602 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8" Apr 17 17:26:29.572138 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.572073 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-775565c6cf-d4kzv" podUID="9b1aa968-7a05-4027-a8d7-648be60980c0" containerName="console" containerID="cri-o://36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f" gracePeriod=15 Apr 17 17:26:29.819190 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.819164 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-775565c6cf-d4kzv_9b1aa968-7a05-4027-a8d7-648be60980c0/console/0.log" Apr 17 17:26:29.819305 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.819238 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-775565c6cf-d4kzv" Apr 17 17:26:29.898188 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898151 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-console-config\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " Apr 17 17:26:29.898343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898206 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-oauth-config\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " Apr 17 17:26:29.898343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898262 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq2v8\" (UniqueName: \"kubernetes.io/projected/9b1aa968-7a05-4027-a8d7-648be60980c0-kube-api-access-xq2v8\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " Apr 17 17:26:29.898343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898299 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-serving-cert\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " Apr 17 17:26:29.898343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898327 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-oauth-serving-cert\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") " Apr 17 
17:26:29.898569 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898357 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-service-ca\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") "
Apr 17 17:26:29.898569 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898403 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-trusted-ca-bundle\") pod \"9b1aa968-7a05-4027-a8d7-648be60980c0\" (UID: \"9b1aa968-7a05-4027-a8d7-648be60980c0\") "
Apr 17 17:26:29.898569 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898524 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-console-config" (OuterVolumeSpecName: "console-config") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:29.898728 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898659 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-console-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:29.898843 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898810 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:29.898843 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898824 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:29.899020 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.898836 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-service-ca" (OuterVolumeSpecName: "service-ca") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:29.900609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.900587 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:29.900691 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.900643 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1aa968-7a05-4027-a8d7-648be60980c0-kube-api-access-xq2v8" (OuterVolumeSpecName: "kube-api-access-xq2v8") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "kube-api-access-xq2v8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:29.900730 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.900691 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9b1aa968-7a05-4027-a8d7-648be60980c0" (UID: "9b1aa968-7a05-4027-a8d7-648be60980c0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:30.003052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.999629 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-trusted-ca-bundle\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:30.003052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.999713 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-oauth-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:30.003052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.999731 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xq2v8\" (UniqueName: \"kubernetes.io/projected/9b1aa968-7a05-4027-a8d7-648be60980c0-kube-api-access-xq2v8\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:30.003052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.999749 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b1aa968-7a05-4027-a8d7-648be60980c0-console-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:30.003052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.999764 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-oauth-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:30.003052 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:29.999786 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9b1aa968-7a05-4027-a8d7-648be60980c0-service-ca\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:30.304006 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.303933 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-775565c6cf-d4kzv_9b1aa968-7a05-4027-a8d7-648be60980c0/console/0.log"
Apr 17 17:26:30.304006 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.303973 2573 generic.go:358] "Generic (PLEG): container finished" podID="9b1aa968-7a05-4027-a8d7-648be60980c0" containerID="36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f" exitCode=2
Apr 17 17:26:30.304177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.304057 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-775565c6cf-d4kzv"
Apr 17 17:26:30.304177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.304060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-775565c6cf-d4kzv" event={"ID":"9b1aa968-7a05-4027-a8d7-648be60980c0","Type":"ContainerDied","Data":"36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f"}
Apr 17 17:26:30.304177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.304100 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-775565c6cf-d4kzv" event={"ID":"9b1aa968-7a05-4027-a8d7-648be60980c0","Type":"ContainerDied","Data":"594e405c27343c364a33f9e4072ecfed976eb6deaa7fd69f7e6853f8b57bf34a"}
Apr 17 17:26:30.304177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.304117 2573 scope.go:117] "RemoveContainer" containerID="36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f"
Apr 17 17:26:30.312630 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.312613 2573 scope.go:117] "RemoveContainer" containerID="36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f"
Apr 17 17:26:30.312950 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:26:30.312922 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f\": container with ID starting with 36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f not found: ID does not exist" containerID="36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f"
Apr 17 17:26:30.313020 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.312958 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f"} err="failed to get container status \"36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f\": rpc error: code = NotFound desc = could not find container \"36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f\": container with ID starting with 36dfacc3d46d600fc60b8ae3afc5aae7da617da887335fa3ef7b0df9b6136c3f not found: ID does not exist"
Apr 17 17:26:30.325377 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.325350 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-775565c6cf-d4kzv"]
Apr 17 17:26:30.328912 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.328891 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-775565c6cf-d4kzv"]
Apr 17 17:26:30.924187 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:30.924147 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1aa968-7a05-4027-a8d7-648be60980c0" path="/var/lib/kubelet/pods/9b1aa968-7a05-4027-a8d7-648be60980c0/volumes"
Apr 17 17:26:32.245793 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.245756 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6bf9689f8d-g8d5w" podUID="5faba003-cf23-4cd7-9b0c-f378f21505de" containerName="console" containerID="cri-o://cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f" gracePeriod=15
Apr 17 17:26:32.482990 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.482970 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bf9689f8d-g8d5w_5faba003-cf23-4cd7-9b0c-f378f21505de/console/0.log"
Apr 17 17:26:32.483102 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.483025 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bf9689f8d-g8d5w"
Apr 17 17:26:32.517063 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.516987 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-console-config\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517176 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517063 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-service-ca\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517176 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517111 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-oauth-config\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517176 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517152 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8cf9\" (UniqueName: \"kubernetes.io/projected/5faba003-cf23-4cd7-9b0c-f378f21505de-kube-api-access-k8cf9\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517335 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517177 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-trusted-ca-bundle\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517335 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517217 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-serving-cert\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517399 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-console-config" (OuterVolumeSpecName: "console-config") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:32.517547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517527 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-service-ca" (OuterVolumeSpecName: "service-ca") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:32.517708 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517557 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-oauth-serving-cert\") pod \"5faba003-cf23-4cd7-9b0c-f378f21505de\" (UID: \"5faba003-cf23-4cd7-9b0c-f378f21505de\") "
Apr 17 17:26:32.517766 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517726 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:32.517903 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517865 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-console-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:32.517903 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517890 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-service-ca\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:32.518004 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517904 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-trusted-ca-bundle\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:32.518004 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.517907 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:32.519374 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.519352 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5faba003-cf23-4cd7-9b0c-f378f21505de-kube-api-access-k8cf9" (OuterVolumeSpecName: "kube-api-access-k8cf9") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "kube-api-access-k8cf9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:32.519481 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.519378 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:32.519664 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.519648 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5faba003-cf23-4cd7-9b0c-f378f21505de" (UID: "5faba003-cf23-4cd7-9b0c-f378f21505de"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:32.619231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.619201 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-oauth-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:32.619231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.619226 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8cf9\" (UniqueName: \"kubernetes.io/projected/5faba003-cf23-4cd7-9b0c-f378f21505de-kube-api-access-k8cf9\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:32.619231 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.619236 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5faba003-cf23-4cd7-9b0c-f378f21505de-console-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:32.619466 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:32.619245 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5faba003-cf23-4cd7-9b0c-f378f21505de-oauth-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:33.314734 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.314657 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6bf9689f8d-g8d5w_5faba003-cf23-4cd7-9b0c-f378f21505de/console/0.log"
Apr 17 17:26:33.314734 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.314698 2573 generic.go:358] "Generic (PLEG): container finished" podID="5faba003-cf23-4cd7-9b0c-f378f21505de" containerID="cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f" exitCode=2
Apr 17 17:26:33.315113 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.314754 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf9689f8d-g8d5w" event={"ID":"5faba003-cf23-4cd7-9b0c-f378f21505de","Type":"ContainerDied","Data":"cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f"}
Apr 17 17:26:33.315113 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.314765 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6bf9689f8d-g8d5w"
Apr 17 17:26:33.315113 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.314788 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6bf9689f8d-g8d5w" event={"ID":"5faba003-cf23-4cd7-9b0c-f378f21505de","Type":"ContainerDied","Data":"dcd64d0308e629326eaa48a67cf3f7940ef28bf327bf433a0a42730ce8def0b7"}
Apr 17 17:26:33.315113 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.314807 2573 scope.go:117] "RemoveContainer" containerID="cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f"
Apr 17 17:26:33.321953 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.321931 2573 scope.go:117] "RemoveContainer" containerID="cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f"
Apr 17 17:26:33.322179 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:26:33.322163 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f\": container with ID starting with cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f not found: ID does not exist" containerID="cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f"
Apr 17 17:26:33.322235 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.322186 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f"} err="failed to get container status \"cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f\": rpc error: code = NotFound desc = could not find container \"cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f\": container with ID starting with cbfe114b05fcd15fea48f03c3930a43f12460149aec966c3d2b2e2b28883326f not found: ID does not exist"
Apr 17 17:26:33.331933 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.331911 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6bf9689f8d-g8d5w"]
Apr 17 17:26:33.336089 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:33.336066 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6bf9689f8d-g8d5w"]
Apr 17 17:26:34.924313 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:34.924280 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5faba003-cf23-4cd7-9b0c-f378f21505de" path="/var/lib/kubelet/pods/5faba003-cf23-4cd7-9b0c-f378f21505de/volumes"
Apr 17 17:26:41.770638 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:41.770608 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8"
Apr 17 17:26:41.774532 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:41.774509 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f8dbf5fdc-sfvf8"
Apr 17 17:26:42.352567 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.352530 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5bf5cbf9f4-bxhqx" podUID="49a11a4a-ef4f-4e56-bd4d-c04199e78d20" containerName="console" containerID="cri-o://1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063" gracePeriod=15
Apr 17 17:26:42.602843 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.602787 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bf5cbf9f4-bxhqx_49a11a4a-ef4f-4e56-bd4d-c04199e78d20/console/0.log"
Apr 17 17:26:42.602946 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.602845 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:26:42.694966 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.694931 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h9zt\" (UniqueName: \"kubernetes.io/projected/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-kube-api-access-2h9zt\") pod \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") "
Apr 17 17:26:42.694966 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.694966 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-oauth-serving-cert\") pod \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") "
Apr 17 17:26:42.695211 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695004 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-service-ca\") pod \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") "
Apr 17 17:26:42.695211 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695021 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-config\") pod \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") "
Apr 17 17:26:42.695211 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695050 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-oauth-config\") pod \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") "
Apr 17 17:26:42.695211 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695091 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-serving-cert\") pod \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\" (UID: \"49a11a4a-ef4f-4e56-bd4d-c04199e78d20\") "
Apr 17 17:26:42.695401 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695360 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-config" (OuterVolumeSpecName: "console-config") pod "49a11a4a-ef4f-4e56-bd4d-c04199e78d20" (UID: "49a11a4a-ef4f-4e56-bd4d-c04199e78d20"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:42.695401 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695371 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-service-ca" (OuterVolumeSpecName: "service-ca") pod "49a11a4a-ef4f-4e56-bd4d-c04199e78d20" (UID: "49a11a4a-ef4f-4e56-bd4d-c04199e78d20"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:42.695521 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.695473 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "49a11a4a-ef4f-4e56-bd4d-c04199e78d20" (UID: "49a11a4a-ef4f-4e56-bd4d-c04199e78d20"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:26:42.697270 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.697242 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "49a11a4a-ef4f-4e56-bd4d-c04199e78d20" (UID: "49a11a4a-ef4f-4e56-bd4d-c04199e78d20"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:42.697384 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.697349 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "49a11a4a-ef4f-4e56-bd4d-c04199e78d20" (UID: "49a11a4a-ef4f-4e56-bd4d-c04199e78d20"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:26:42.697384 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.697356 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-kube-api-access-2h9zt" (OuterVolumeSpecName: "kube-api-access-2h9zt") pod "49a11a4a-ef4f-4e56-bd4d-c04199e78d20" (UID: "49a11a4a-ef4f-4e56-bd4d-c04199e78d20"). InnerVolumeSpecName "kube-api-access-2h9zt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:26:42.800620 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.796576 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h9zt\" (UniqueName: \"kubernetes.io/projected/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-kube-api-access-2h9zt\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:42.800620 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.796884 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-oauth-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:42.800620 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.796915 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-service-ca\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:42.800620 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.796931 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:42.800620 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.796953 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-oauth-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:42.800620 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:42.796968 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/49a11a4a-ef4f-4e56-bd4d-c04199e78d20-console-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:26:43.344279 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.344202 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5bf5cbf9f4-bxhqx_49a11a4a-ef4f-4e56-bd4d-c04199e78d20/console/0.log"
Apr 17 17:26:43.344279 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.344245 2573 generic.go:358] "Generic (PLEG): container finished" podID="49a11a4a-ef4f-4e56-bd4d-c04199e78d20" containerID="1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063" exitCode=2
Apr 17 17:26:43.344561 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.344281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf5cbf9f4-bxhqx" event={"ID":"49a11a4a-ef4f-4e56-bd4d-c04199e78d20","Type":"ContainerDied","Data":"1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063"}
Apr 17 17:26:43.344561 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.344308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bf5cbf9f4-bxhqx" event={"ID":"49a11a4a-ef4f-4e56-bd4d-c04199e78d20","Type":"ContainerDied","Data":"760e918cc95b110018512dde27b39cf8ceabcfae3d8094fe3d6a1533f77915ca"}
Apr 17 17:26:43.344561 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.344341 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bf5cbf9f4-bxhqx"
Apr 17 17:26:43.344561 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.344332 2573 scope.go:117] "RemoveContainer" containerID="1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063"
Apr 17 17:26:43.352081 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.352063 2573 scope.go:117] "RemoveContainer" containerID="1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063"
Apr 17 17:26:43.352332 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:26:43.352311 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063\": container with ID starting with 1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063 not found: ID does not exist" containerID="1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063"
Apr 17 17:26:43.352399 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.352339 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063"} err="failed to get container status \"1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063\": rpc error: code = NotFound desc = could not find container \"1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063\": container with ID starting with 1e197682ecca7696d10bc6f79670ac10e96a393445ef689fb652ce9eb9638063 not found: ID does not exist"
Apr 17 17:26:43.360960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.360929 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5bf5cbf9f4-bxhqx"]
Apr 17 17:26:43.364524 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:43.364500 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5bf5cbf9f4-bxhqx"]
Apr 17 17:26:44.924744 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:44.924711 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a11a4a-ef4f-4e56-bd4d-c04199e78d20" path="/var/lib/kubelet/pods/49a11a4a-ef4f-4e56-bd4d-c04199e78d20/volumes"
Apr 17 17:26:45.262887 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:26:45.262809 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-56d9d"
Apr 17 17:27:16.894121 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894090 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-964f886d4-458t4"]
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894351 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b1aa968-7a05-4027-a8d7-648be60980c0" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894363 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1aa968-7a05-4027-a8d7-648be60980c0" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894380 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49a11a4a-ef4f-4e56-bd4d-c04199e78d20" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894386 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a11a4a-ef4f-4e56-bd4d-c04199e78d20" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894393 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5faba003-cf23-4cd7-9b0c-f378f21505de" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894398 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5faba003-cf23-4cd7-9b0c-f378f21505de" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894450 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5faba003-cf23-4cd7-9b0c-f378f21505de" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894457 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b1aa968-7a05-4027-a8d7-648be60980c0" containerName="console"
Apr 17 17:27:16.894583 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.894465 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="49a11a4a-ef4f-4e56-bd4d-c04199e78d20" containerName="console"
Apr 17 17:27:16.897073 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.897056 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:16.913589 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:16.913564 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-964f886d4-458t4"]
Apr 17 17:27:17.035593 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035554 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-oauth-config\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.035781 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-trusted-ca-bundle\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.035781 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrzqx\" (UniqueName: \"kubernetes.io/projected/ae91566e-7930-4d06-9bb2-1f6e940120f1-kube-api-access-rrzqx\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.035781 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035670 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-config\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.035781 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-serving-cert\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.035934 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-service-ca\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.035934 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.035850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-oauth-serving-cert\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:27:17.136850
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.136814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-oauth-serving-cert\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137021 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.136880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-oauth-config\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137021 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.136922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-trusted-ca-bundle\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137021 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.136944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrzqx\" (UniqueName: \"kubernetes.io/projected/ae91566e-7930-4d06-9bb2-1f6e940120f1-kube-api-access-rrzqx\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137021 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.136970 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-config\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " 
pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137021 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.137005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-serving-cert\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137265 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.137031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-service-ca\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137695 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.137669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-oauth-serving-cert\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137807 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.137717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-config\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137807 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.137669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-service-ca\") pod \"console-964f886d4-458t4\" (UID: 
\"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.137974 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.137954 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-trusted-ca-bundle\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.139511 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.139492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-serving-cert\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.139603 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.139515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-oauth-config\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.154019 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.153951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrzqx\" (UniqueName: \"kubernetes.io/projected/ae91566e-7930-4d06-9bb2-1f6e940120f1-kube-api-access-rrzqx\") pod \"console-964f886d4-458t4\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") " pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.206065 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.206008 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:17.335155 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.335122 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-964f886d4-458t4"] Apr 17 17:27:17.338492 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:27:17.338459 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae91566e_7930_4d06_9bb2_1f6e940120f1.slice/crio-641bb2048fa55621da9d4bcb1b6513f1115b29bdc936506288eef130f604b2a4 WatchSource:0}: Error finding container 641bb2048fa55621da9d4bcb1b6513f1115b29bdc936506288eef130f604b2a4: Status 404 returned error can't find the container with id 641bb2048fa55621da9d4bcb1b6513f1115b29bdc936506288eef130f604b2a4 Apr 17 17:27:17.437033 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.436945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-964f886d4-458t4" event={"ID":"ae91566e-7930-4d06-9bb2-1f6e940120f1","Type":"ContainerStarted","Data":"2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113"} Apr 17 17:27:17.437033 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.436987 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-964f886d4-458t4" event={"ID":"ae91566e-7930-4d06-9bb2-1f6e940120f1","Type":"ContainerStarted","Data":"641bb2048fa55621da9d4bcb1b6513f1115b29bdc936506288eef130f604b2a4"} Apr 17 17:27:17.458301 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:17.458249 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-964f886d4-458t4" podStartSLOduration=1.458235129 podStartE2EDuration="1.458235129s" podCreationTimestamp="2026-04-17 17:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:17.45685503 +0000 UTC m=+133.124863593" 
watchObservedRunningTime="2026-04-17 17:27:17.458235129 +0000 UTC m=+133.126243696" Apr 17 17:27:27.207172 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:27.207124 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:27.207172 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:27.207172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:27.211680 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:27.211658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:27.466308 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:27.466225 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-964f886d4-458t4" Apr 17 17:27:27.515221 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:27.515186 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f459bb64b-klx5x"] Apr 17 17:27:40.482498 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.482460 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-b4bv4"] Apr 17 17:27:40.485642 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.485622 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.488120 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.488098 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:27:40.493978 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.493956 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b4bv4"] Apr 17 17:27:40.614405 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.614370 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7bea17df-5d4f-4d36-a769-2336b30abea8-original-pull-secret\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.614405 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.614411 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7bea17df-5d4f-4d36-a769-2336b30abea8-dbus\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.614614 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.614453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7bea17df-5d4f-4d36-a769-2336b30abea8-kubelet-config\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.714846 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.714809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/7bea17df-5d4f-4d36-a769-2336b30abea8-original-pull-secret\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.715000 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.714853 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7bea17df-5d4f-4d36-a769-2336b30abea8-dbus\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.715000 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.714872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7bea17df-5d4f-4d36-a769-2336b30abea8-kubelet-config\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.715000 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.714960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7bea17df-5d4f-4d36-a769-2336b30abea8-kubelet-config\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.715095 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.715025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7bea17df-5d4f-4d36-a769-2336b30abea8-dbus\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.717002 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.716974 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7bea17df-5d4f-4d36-a769-2336b30abea8-original-pull-secret\") pod \"global-pull-secret-syncer-b4bv4\" (UID: \"7bea17df-5d4f-4d36-a769-2336b30abea8\") " pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.795031 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.794947 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b4bv4" Apr 17 17:27:40.906108 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:40.906071 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b4bv4"] Apr 17 17:27:40.909247 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:27:40.909219 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bea17df_5d4f_4d36_a769_2336b30abea8.slice/crio-c69705f3c6fdb452188ce71f55e2f5556cfe91bf8b973a34630e54711e6843ef WatchSource:0}: Error finding container c69705f3c6fdb452188ce71f55e2f5556cfe91bf8b973a34630e54711e6843ef: Status 404 returned error can't find the container with id c69705f3c6fdb452188ce71f55e2f5556cfe91bf8b973a34630e54711e6843ef Apr 17 17:27:41.504563 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:41.504527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b4bv4" event={"ID":"7bea17df-5d4f-4d36-a769-2336b30abea8","Type":"ContainerStarted","Data":"c69705f3c6fdb452188ce71f55e2f5556cfe91bf8b973a34630e54711e6843ef"} Apr 17 17:27:45.517266 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:45.517191 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b4bv4" event={"ID":"7bea17df-5d4f-4d36-a769-2336b30abea8","Type":"ContainerStarted","Data":"ea84c97cd67538c980625fa282fe0801d9d06a840b7960f1999decf6ab158f80"} Apr 17 17:27:45.537022 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:45.536976 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-b4bv4" podStartSLOduration=1.804206177 podStartE2EDuration="5.536963004s" podCreationTimestamp="2026-04-17 17:27:40 +0000 UTC" firstStartedPulling="2026-04-17 17:27:40.911272679 +0000 UTC m=+156.579281223" lastFinishedPulling="2026-04-17 17:27:44.644029493 +0000 UTC m=+160.312038050" observedRunningTime="2026-04-17 17:27:45.535474514 +0000 UTC m=+161.203483080" watchObservedRunningTime="2026-04-17 17:27:45.536963004 +0000 UTC m=+161.204971570" Apr 17 17:27:52.534067 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.534003 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f459bb64b-klx5x" podUID="0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" containerName="console" containerID="cri-o://c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d" gracePeriod=15 Apr 17 17:27:52.778111 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.778087 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f459bb64b-klx5x_0cb594bc-3f1b-4c8a-897e-70f5038c4d1c/console/0.log" Apr 17 17:27:52.778229 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.778150 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:27:52.797371 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797288 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-trusted-ca-bundle\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.797371 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797349 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-config\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.797577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797372 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-oauth-serving-cert\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.797577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797412 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-oauth-config\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.797577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797464 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-245qq\" (UniqueName: \"kubernetes.io/projected/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-kube-api-access-245qq\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.797577 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797498 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-serving-cert\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.797577 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797530 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-service-ca\") pod \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\" (UID: \"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c\") " Apr 17 17:27:52.798062 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.797855 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-config" (OuterVolumeSpecName: "console-config") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:52.798187 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.798096 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-service-ca" (OuterVolumeSpecName: "service-ca") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:52.798586 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.798483 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:52.798586 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.798553 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:52.800378 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.800336 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:27:52.800488 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.800393 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-kube-api-access-245qq" (OuterVolumeSpecName: "kube-api-access-245qq") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "kube-api-access-245qq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:27:52.801264 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.801237 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" (UID: "0cb594bc-3f1b-4c8a-897e-70f5038c4d1c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:27:52.898861 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898818 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:52.898861 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898850 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-service-ca\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:52.898861 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898864 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-trusted-ca-bundle\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:52.899119 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898878 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:52.899119 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898890 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-oauth-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:52.899119 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898899 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-console-oauth-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:52.899119 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:52.898907 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-245qq\" (UniqueName: \"kubernetes.io/projected/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c-kube-api-access-245qq\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:27:53.539967 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.539938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f459bb64b-klx5x_0cb594bc-3f1b-4c8a-897e-70f5038c4d1c/console/0.log" Apr 17 17:27:53.540348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.539975 2573 generic.go:358] "Generic (PLEG): container finished" podID="0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" containerID="c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d" exitCode=2 Apr 17 17:27:53.540348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.540029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f459bb64b-klx5x" event={"ID":"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c","Type":"ContainerDied","Data":"c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d"} Apr 17 17:27:53.540348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.540038 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f459bb64b-klx5x" Apr 17 17:27:53.540348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.540055 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f459bb64b-klx5x" event={"ID":"0cb594bc-3f1b-4c8a-897e-70f5038c4d1c","Type":"ContainerDied","Data":"82061684d6f9e1c37eebe45bd7abebf630205155f5c556b3c75d1a7cef7579db"} Apr 17 17:27:53.540348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.540070 2573 scope.go:117] "RemoveContainer" containerID="c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d" Apr 17 17:27:53.547346 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.547328 2573 scope.go:117] "RemoveContainer" containerID="c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d" Apr 17 17:27:53.547594 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:27:53.547572 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d\": container with ID starting with c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d not found: ID does not exist" containerID="c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d" Apr 17 17:27:53.547689 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.547599 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d"} err="failed to get container status \"c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d\": rpc error: code = NotFound desc = could not find container \"c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d\": container with ID starting with c0b8822aa986561171c8b51ef6bb319f2b6ff5a105acfe723041db0b5e23de5d not found: ID does not exist" Apr 17 17:27:53.557209 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.557180 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f459bb64b-klx5x"] Apr 17 17:27:53.560726 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:53.560705 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f459bb64b-klx5x"] Apr 17 17:27:54.924479 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:54.924447 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" path="/var/lib/kubelet/pods/0cb594bc-3f1b-4c8a-897e-70f5038c4d1c/volumes" Apr 17 17:27:58.895396 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.895361 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff"] Apr 17 17:27:58.895764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.895645 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" containerName="console" Apr 17 17:27:58.895764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.895657 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" containerName="console" Apr 17 17:27:58.895764 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.895706 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cb594bc-3f1b-4c8a-897e-70f5038c4d1c" containerName="console" Apr 17 17:27:58.900023 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.900003 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:58.902447 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.902405 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:27:58.902447 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.902405 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-vdp24\"" Apr 17 17:27:58.903334 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.903321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:27:58.906885 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.906864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff"] Apr 17 17:27:58.944644 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.944612 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:58.944817 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.944689 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dkf\" (UniqueName: \"kubernetes.io/projected/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-kube-api-access-72dkf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:58.944817 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:58.944747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.046080 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.046040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.046229 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.046089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.046229 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.046150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72dkf\" (UniqueName: \"kubernetes.io/projected/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-kube-api-access-72dkf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.046493 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.046413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.046544 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.046504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.054891 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.054859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72dkf\" (UniqueName: \"kubernetes.io/projected/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-kube-api-access-72dkf\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.209541 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.209447 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:27:59.331567 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.331487 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff"] Apr 17 17:27:59.334591 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:27:59.334559 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod516fb040_3fbf_481f_ae3d_c8ae6c5dcd42.slice/crio-9f21e312f693a5b55c8a5c73490daeb981b887926ee7011c511627446d494b49 WatchSource:0}: Error finding container 9f21e312f693a5b55c8a5c73490daeb981b887926ee7011c511627446d494b49: Status 404 returned error can't find the container with id 9f21e312f693a5b55c8a5c73490daeb981b887926ee7011c511627446d494b49 Apr 17 17:27:59.556566 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:27:59.556486 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" event={"ID":"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42","Type":"ContainerStarted","Data":"9f21e312f693a5b55c8a5c73490daeb981b887926ee7011c511627446d494b49"} Apr 17 17:28:07.579670 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:07.579630 2573 generic.go:358] "Generic (PLEG): container finished" podID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerID="a4327a0da6b0daf6d1977cb2e52575ef8b1d78ca8d535382cae8ae0d08bf3a18" exitCode=0 Apr 17 17:28:07.580043 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:07.579677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" event={"ID":"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42","Type":"ContainerDied","Data":"a4327a0da6b0daf6d1977cb2e52575ef8b1d78ca8d535382cae8ae0d08bf3a18"} Apr 17 17:28:09.587650 ip-10-0-131-192 kubenswrapper[2573]: 
I0417 17:28:09.587616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" event={"ID":"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42","Type":"ContainerStarted","Data":"cb73e8e07d7586d4107113e15f0f682883127fb804120fb1a2247a618b1a970f"} Apr 17 17:28:10.591910 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:10.591872 2573 generic.go:358] "Generic (PLEG): container finished" podID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerID="cb73e8e07d7586d4107113e15f0f682883127fb804120fb1a2247a618b1a970f" exitCode=0 Apr 17 17:28:10.592371 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:10.591956 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" event={"ID":"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42","Type":"ContainerDied","Data":"cb73e8e07d7586d4107113e15f0f682883127fb804120fb1a2247a618b1a970f"} Apr 17 17:28:21.626838 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:21.626806 2573 generic.go:358] "Generic (PLEG): container finished" podID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerID="00279cfe28a951e98932365d4e78a81e57880dfef82c0b017b017d08a93edd84" exitCode=0 Apr 17 17:28:21.627237 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:21.626843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" event={"ID":"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42","Type":"ContainerDied","Data":"00279cfe28a951e98932365d4e78a81e57880dfef82c0b017b017d08a93edd84"} Apr 17 17:28:22.744943 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.744911 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:28:22.838644 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.838609 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72dkf\" (UniqueName: \"kubernetes.io/projected/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-kube-api-access-72dkf\") pod \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " Apr 17 17:28:22.838832 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.838672 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-bundle\") pod \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " Apr 17 17:28:22.838832 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.838715 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-util\") pod \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\" (UID: \"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42\") " Apr 17 17:28:22.839275 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.839249 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-bundle" (OuterVolumeSpecName: "bundle") pod "516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" (UID: "516fb040-3fbf-481f-ae3d-c8ae6c5dcd42"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:28:22.840917 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.840892 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-kube-api-access-72dkf" (OuterVolumeSpecName: "kube-api-access-72dkf") pod "516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" (UID: "516fb040-3fbf-481f-ae3d-c8ae6c5dcd42"). InnerVolumeSpecName "kube-api-access-72dkf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:22.843319 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.843287 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-util" (OuterVolumeSpecName: "util") pod "516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" (UID: "516fb040-3fbf-481f-ae3d-c8ae6c5dcd42"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:28:22.940226 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.940197 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72dkf\" (UniqueName: \"kubernetes.io/projected/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-kube-api-access-72dkf\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:28:22.940226 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.940223 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-bundle\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:28:22.940388 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:22.940236 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/516fb040-3fbf-481f-ae3d-c8ae6c5dcd42-util\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:28:23.632993 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:23.632959 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" event={"ID":"516fb040-3fbf-481f-ae3d-c8ae6c5dcd42","Type":"ContainerDied","Data":"9f21e312f693a5b55c8a5c73490daeb981b887926ee7011c511627446d494b49"} Apr 17 17:28:23.632993 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:23.632995 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f21e312f693a5b55c8a5c73490daeb981b887926ee7011c511627446d494b49" Apr 17 17:28:23.633193 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:23.632967 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cmlzff" Apr 17 17:28:26.035139 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035101 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj"] Apr 17 17:28:26.035513 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035498 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="extract" Apr 17 17:28:26.035564 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035517 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="extract" Apr 17 17:28:26.035564 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035534 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="util" Apr 17 17:28:26.035564 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035542 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="util" Apr 17 17:28:26.035564 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035556 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="pull" Apr 17 17:28:26.035564 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035565 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="pull" Apr 17 17:28:26.035724 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.035628 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="516fb040-3fbf-481f-ae3d-c8ae6c5dcd42" containerName="extract" Apr 17 17:28:26.038766 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.038750 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.041503 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.041477 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 17:28:26.041503 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.041494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 17:28:26.041661 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.041589 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 17:28:26.041661 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.041600 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-kf64r\"" Apr 17 17:28:26.053759 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.053738 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj"] Apr 17 17:28:26.163863 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.163827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/885bb386-31b1-4f38-a16a-d30bf03c444b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj\" (UID: \"885bb386-31b1-4f38-a16a-d30bf03c444b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.164030 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.163899 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgn6z\" (UniqueName: \"kubernetes.io/projected/885bb386-31b1-4f38-a16a-d30bf03c444b-kube-api-access-wgn6z\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj\" (UID: \"885bb386-31b1-4f38-a16a-d30bf03c444b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.264470 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.264408 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgn6z\" (UniqueName: \"kubernetes.io/projected/885bb386-31b1-4f38-a16a-d30bf03c444b-kube-api-access-wgn6z\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj\" (UID: \"885bb386-31b1-4f38-a16a-d30bf03c444b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.264675 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.264492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/885bb386-31b1-4f38-a16a-d30bf03c444b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj\" (UID: \"885bb386-31b1-4f38-a16a-d30bf03c444b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.266837 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.266810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/885bb386-31b1-4f38-a16a-d30bf03c444b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj\" (UID: 
\"885bb386-31b1-4f38-a16a-d30bf03c444b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.281256 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.281229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgn6z\" (UniqueName: \"kubernetes.io/projected/885bb386-31b1-4f38-a16a-d30bf03c444b-kube-api-access-wgn6z\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj\" (UID: \"885bb386-31b1-4f38-a16a-d30bf03c444b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.348872 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.348774 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:26.494377 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.494348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj"] Apr 17 17:28:26.496810 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:28:26.496787 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod885bb386_31b1_4f38_a16a_d30bf03c444b.slice/crio-651d3afcd9f3db71fb85501e4f0c81af32b5343eae4454078519d836625de601 WatchSource:0}: Error finding container 651d3afcd9f3db71fb85501e4f0c81af32b5343eae4454078519d836625de601: Status 404 returned error can't find the container with id 651d3afcd9f3db71fb85501e4f0c81af32b5343eae4454078519d836625de601 Apr 17 17:28:26.642554 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:26.642521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" event={"ID":"885bb386-31b1-4f38-a16a-d30bf03c444b","Type":"ContainerStarted","Data":"651d3afcd9f3db71fb85501e4f0c81af32b5343eae4454078519d836625de601"} Apr 17 17:28:30.655302 ip-10-0-131-192 kubenswrapper[2573]: 
I0417 17:28:30.655260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" event={"ID":"885bb386-31b1-4f38-a16a-d30bf03c444b","Type":"ContainerStarted","Data":"09d7b99fffb18062139e63387021ec64fb3ad991ac2fb73a7265e66247f40bf3"} Apr 17 17:28:30.655680 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.655330 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:30.723787 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.723734 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" podStartSLOduration=1.00491779 podStartE2EDuration="4.72371649s" podCreationTimestamp="2026-04-17 17:28:26 +0000 UTC" firstStartedPulling="2026-04-17 17:28:26.498408658 +0000 UTC m=+202.166417204" lastFinishedPulling="2026-04-17 17:28:30.217207354 +0000 UTC m=+205.885215904" observedRunningTime="2026-04-17 17:28:30.72223076 +0000 UTC m=+206.390239326" watchObservedRunningTime="2026-04-17 17:28:30.72371649 +0000 UTC m=+206.391725055" Apr 17 17:28:30.825183 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.825146 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2wzns"] Apr 17 17:28:30.828391 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.828375 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:30.831530 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.831508 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 17:28:30.831646 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.831527 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 17:28:30.831905 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.831890 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-4rzzn\"" Apr 17 17:28:30.838001 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.837971 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2wzns"] Apr 17 17:28:30.905540 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.905455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:30.905540 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.905493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b999af1b-c5ea-449e-9b72-c09356f58d72-cabundle0\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:30.905540 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:30.905523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ppnf\" (UniqueName: 
\"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-kube-api-access-9ppnf\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.006782 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.006740 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ppnf\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-kube-api-access-9ppnf\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.006970 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.006807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.006970 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.006828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b999af1b-c5ea-449e-9b72-c09356f58d72-cabundle0\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.006970 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.006914 2573 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 17 17:28:31.006970 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.006938 2573 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:28:31.006970 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.006948 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references 
non-existent secret key: ca.crt Apr 17 17:28:31.006970 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.006962 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2wzns: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 17:28:31.007274 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.007041 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates podName:b999af1b-c5ea-449e-9b72-c09356f58d72 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.507006661 +0000 UTC m=+207.175015226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates") pod "keda-operator-ffbb595cb-2wzns" (UID: "b999af1b-c5ea-449e-9b72-c09356f58d72") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 17:28:31.007348 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.007331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/b999af1b-c5ea-449e-9b72-c09356f58d72-cabundle0\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.017901 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.017880 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ppnf\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-kube-api-access-9ppnf\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.092646 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.092615 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh"] Apr 17 17:28:31.095869 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.095846 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.098240 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.098218 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 17:28:31.104996 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.104961 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh"] Apr 17 17:28:31.208844 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.208752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d77j\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-kube-api-access-7d77j\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.208844 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.208791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8add8df1-56f9-422a-812b-efe372daa911-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.209070 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.208954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.309876 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.309826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.310043 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.309979 2573 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:28:31.310043 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.310002 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:28:31.310043 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.310024 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh: references non-existent secret key: tls.crt Apr 17 17:28:31.310153 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.310078 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates podName:8add8df1-56f9-422a-812b-efe372daa911 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.810061274 +0000 UTC m=+207.478069819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates") pod "keda-metrics-apiserver-7c9f485588-7ngqh" (UID: "8add8df1-56f9-422a-812b-efe372daa911") : references non-existent secret key: tls.crt Apr 17 17:28:31.310153 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.309983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d77j\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-kube-api-access-7d77j\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.310240 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.310162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8add8df1-56f9-422a-812b-efe372daa911-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.310515 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.310500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/8add8df1-56f9-422a-812b-efe372daa911-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.316292 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.316267 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-sv96p"] Apr 17 17:28:31.319593 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.319574 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.321761 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.321739 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 17:28:31.323804 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.323782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d77j\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-kube-api-access-7d77j\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.328084 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.328064 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-sv96p"] Apr 17 17:28:31.410801 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.410756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv6sk\" (UniqueName: \"kubernetes.io/projected/85baf2d9-86b8-4de4-9f7c-012ffcb590b8-kube-api-access-tv6sk\") pod \"keda-admission-cf49989db-sv96p\" (UID: \"85baf2d9-86b8-4de4-9f7c-012ffcb590b8\") " pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.410801 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.410807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/85baf2d9-86b8-4de4-9f7c-012ffcb590b8-certificates\") pod \"keda-admission-cf49989db-sv96p\" (UID: \"85baf2d9-86b8-4de4-9f7c-012ffcb590b8\") " pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.511730 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.511635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv6sk\" 
(UniqueName: \"kubernetes.io/projected/85baf2d9-86b8-4de4-9f7c-012ffcb590b8-kube-api-access-tv6sk\") pod \"keda-admission-cf49989db-sv96p\" (UID: \"85baf2d9-86b8-4de4-9f7c-012ffcb590b8\") " pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.511730 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.511676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/85baf2d9-86b8-4de4-9f7c-012ffcb590b8-certificates\") pod \"keda-admission-cf49989db-sv96p\" (UID: \"85baf2d9-86b8-4de4-9f7c-012ffcb590b8\") " pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.511730 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.511720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:31.512012 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.511814 2573 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:28:31.512012 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.511825 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:28:31.512012 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.511834 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2wzns: references non-existent secret key: ca.crt Apr 17 17:28:31.512012 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.511901 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates podName:b999af1b-c5ea-449e-9b72-c09356f58d72 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:28:32.51188659 +0000 UTC m=+208.179895135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates") pod "keda-operator-ffbb595cb-2wzns" (UID: "b999af1b-c5ea-449e-9b72-c09356f58d72") : references non-existent secret key: ca.crt Apr 17 17:28:31.515741 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.514409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/85baf2d9-86b8-4de4-9f7c-012ffcb590b8-certificates\") pod \"keda-admission-cf49989db-sv96p\" (UID: \"85baf2d9-86b8-4de4-9f7c-012ffcb590b8\") " pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.524332 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.524307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv6sk\" (UniqueName: \"kubernetes.io/projected/85baf2d9-86b8-4de4-9f7c-012ffcb590b8-kube-api-access-tv6sk\") pod \"keda-admission-cf49989db-sv96p\" (UID: \"85baf2d9-86b8-4de4-9f7c-012ffcb590b8\") " pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.636547 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.636504 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:31.782271 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.782243 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-sv96p"] Apr 17 17:28:31.784948 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:28:31.784922 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85baf2d9_86b8_4de4_9f7c_012ffcb590b8.slice/crio-e8d2ba3357f61b571b7d35c97b7d3c338cfad28c668f5d0a2a004d91dcd4abef WatchSource:0}: Error finding container e8d2ba3357f61b571b7d35c97b7d3c338cfad28c668f5d0a2a004d91dcd4abef: Status 404 returned error can't find the container with id e8d2ba3357f61b571b7d35c97b7d3c338cfad28c668f5d0a2a004d91dcd4abef Apr 17 17:28:31.813702 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:31.813661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:31.813868 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.813844 2573 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:28:31.813868 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.813863 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:28:31.813944 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.813882 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh: references non-existent secret key: tls.crt Apr 17 17:28:31.813944 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:31.813931 2573 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates podName:8add8df1-56f9-422a-812b-efe372daa911 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:32.813917243 +0000 UTC m=+208.481925787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates") pod "keda-metrics-apiserver-7c9f485588-7ngqh" (UID: "8add8df1-56f9-422a-812b-efe372daa911") : references non-existent secret key: tls.crt Apr 17 17:28:32.519331 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:32.519290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:32.519517 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.519445 2573 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:28:32.519517 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.519464 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:28:32.519517 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.519473 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2wzns: references non-existent secret key: ca.crt Apr 17 17:28:32.519613 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.519526 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates podName:b999af1b-c5ea-449e-9b72-c09356f58d72 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:34.519511968 +0000 UTC m=+210.187520512 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates") pod "keda-operator-ffbb595cb-2wzns" (UID: "b999af1b-c5ea-449e-9b72-c09356f58d72") : references non-existent secret key: ca.crt Apr 17 17:28:32.665056 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:32.665019 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-sv96p" event={"ID":"85baf2d9-86b8-4de4-9f7c-012ffcb590b8","Type":"ContainerStarted","Data":"e8d2ba3357f61b571b7d35c97b7d3c338cfad28c668f5d0a2a004d91dcd4abef"} Apr 17 17:28:32.822551 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:32.822466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:32.822920 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.822581 2573 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:28:32.822920 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.822593 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:28:32.822920 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.822611 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh: references non-existent secret key: tls.crt Apr 17 17:28:32.822920 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:32.822658 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates podName:8add8df1-56f9-422a-812b-efe372daa911 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:28:34.822645727 +0000 UTC m=+210.490654273 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates") pod "keda-metrics-apiserver-7c9f485588-7ngqh" (UID: "8add8df1-56f9-422a-812b-efe372daa911") : references non-existent secret key: tls.crt Apr 17 17:28:33.668490 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:33.668458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-sv96p" event={"ID":"85baf2d9-86b8-4de4-9f7c-012ffcb590b8","Type":"ContainerStarted","Data":"9959b806ca2861f158e7c06a5d78a3660eca114f3e9798dc5c51f63993192101"} Apr 17 17:28:33.668642 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:33.668508 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:33.688313 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:33.688260 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-sv96p" podStartSLOduration=1.13866902 podStartE2EDuration="2.688244732s" podCreationTimestamp="2026-04-17 17:28:31 +0000 UTC" firstStartedPulling="2026-04-17 17:28:31.786257844 +0000 UTC m=+207.454266390" lastFinishedPulling="2026-04-17 17:28:33.335833557 +0000 UTC m=+209.003842102" observedRunningTime="2026-04-17 17:28:33.685937635 +0000 UTC m=+209.353946201" watchObservedRunningTime="2026-04-17 17:28:33.688244732 +0000 UTC m=+209.356253294" Apr 17 17:28:34.535811 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:34.535751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " 
pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:34.536218 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.535915 2573 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:28:34.536218 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.535929 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:28:34.536218 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.535938 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-2wzns: references non-existent secret key: ca.crt Apr 17 17:28:34.536218 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.535993 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates podName:b999af1b-c5ea-449e-9b72-c09356f58d72 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:38.535975173 +0000 UTC m=+214.203983735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates") pod "keda-operator-ffbb595cb-2wzns" (UID: "b999af1b-c5ea-449e-9b72-c09356f58d72") : references non-existent secret key: ca.crt Apr 17 17:28:34.838264 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:34.838155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:34.838445 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.838294 2573 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:28:34.838445 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.838313 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:28:34.838445 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.838333 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh: references non-existent secret key: tls.crt Apr 17 17:28:34.838445 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:28:34.838406 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates podName:8add8df1-56f9-422a-812b-efe372daa911 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:38.838390815 +0000 UTC m=+214.506399360 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates") pod "keda-metrics-apiserver-7c9f485588-7ngqh" (UID: "8add8df1-56f9-422a-812b-efe372daa911") : references non-existent secret key: tls.crt Apr 17 17:28:38.568628 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.568586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:38.570999 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.570975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b999af1b-c5ea-449e-9b72-c09356f58d72-certificates\") pod \"keda-operator-ffbb595cb-2wzns\" (UID: \"b999af1b-c5ea-449e-9b72-c09356f58d72\") " pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:38.638506 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.638460 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:38.760410 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.760244 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-2wzns"] Apr 17 17:28:38.763315 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:28:38.763271 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb999af1b_c5ea_449e_9b72_c09356f58d72.slice/crio-a43c63ebb1bd2c0fcbcc46da6aec3d35ca3002992b58b71ad122a469a081143f WatchSource:0}: Error finding container a43c63ebb1bd2c0fcbcc46da6aec3d35ca3002992b58b71ad122a469a081143f: Status 404 returned error can't find the container with id a43c63ebb1bd2c0fcbcc46da6aec3d35ca3002992b58b71ad122a469a081143f Apr 17 17:28:38.870809 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.870728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:38.884657 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.884627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/8add8df1-56f9-422a-812b-efe372daa911-certificates\") pod \"keda-metrics-apiserver-7c9f485588-7ngqh\" (UID: \"8add8df1-56f9-422a-812b-efe372daa911\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:38.906058 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:38.906023 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:39.036228 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:39.036202 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh"] Apr 17 17:28:39.038339 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:28:39.038309 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8add8df1_56f9_422a_812b_efe372daa911.slice/crio-95a9f34d599479c9534d5d9cd3b2cbd39c250ae796a56c0eac7b0d37023d67d9 WatchSource:0}: Error finding container 95a9f34d599479c9534d5d9cd3b2cbd39c250ae796a56c0eac7b0d37023d67d9: Status 404 returned error can't find the container with id 95a9f34d599479c9534d5d9cd3b2cbd39c250ae796a56c0eac7b0d37023d67d9 Apr 17 17:28:39.685907 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:39.685850 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" event={"ID":"8add8df1-56f9-422a-812b-efe372daa911","Type":"ContainerStarted","Data":"95a9f34d599479c9534d5d9cd3b2cbd39c250ae796a56c0eac7b0d37023d67d9"} Apr 17 17:28:39.686964 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:39.686938 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" event={"ID":"b999af1b-c5ea-449e-9b72-c09356f58d72","Type":"ContainerStarted","Data":"a43c63ebb1bd2c0fcbcc46da6aec3d35ca3002992b58b71ad122a469a081143f"} Apr 17 17:28:42.698713 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:42.698673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" event={"ID":"b999af1b-c5ea-449e-9b72-c09356f58d72","Type":"ContainerStarted","Data":"60661bf3629e4c372fd74d49b1531772e2ff415a76c72557f882be86791f2246"} Apr 17 17:28:42.699173 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:42.698794 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:28:42.716151 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:42.716094 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" podStartSLOduration=9.673602199 podStartE2EDuration="12.716074627s" podCreationTimestamp="2026-04-17 17:28:30 +0000 UTC" firstStartedPulling="2026-04-17 17:28:38.764359182 +0000 UTC m=+214.432367726" lastFinishedPulling="2026-04-17 17:28:41.806831606 +0000 UTC m=+217.474840154" observedRunningTime="2026-04-17 17:28:42.715185426 +0000 UTC m=+218.383194010" watchObservedRunningTime="2026-04-17 17:28:42.716074627 +0000 UTC m=+218.384083190" Apr 17 17:28:43.703599 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:43.703561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" event={"ID":"8add8df1-56f9-422a-812b-efe372daa911","Type":"ContainerStarted","Data":"5b239af085d1221cdfc8ae386bfb70b5193dfc9062b680b87bee96be046e48d1"} Apr 17 17:28:43.723243 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:43.723191 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" podStartSLOduration=8.628485708 podStartE2EDuration="12.723174578s" podCreationTimestamp="2026-04-17 17:28:31 +0000 UTC" firstStartedPulling="2026-04-17 17:28:39.03960808 +0000 UTC m=+214.707616625" lastFinishedPulling="2026-04-17 17:28:43.13429695 +0000 UTC m=+218.802305495" observedRunningTime="2026-04-17 17:28:43.721924528 +0000 UTC m=+219.389933096" watchObservedRunningTime="2026-04-17 17:28:43.723174578 +0000 UTC m=+219.391183145" Apr 17 17:28:44.706091 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:44.706056 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:28:51.662848 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:51.662813 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-kzmtj" Apr 17 17:28:54.674465 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:54.674419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-sv96p" Apr 17 17:28:55.712570 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:28:55.712539 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-7ngqh" Apr 17 17:29:03.706344 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:03.706311 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-2wzns" Apr 17 17:29:38.975317 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.975284 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mdtsg"] Apr 17 17:29:38.978401 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.978384 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:38.981174 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.981153 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 17:29:38.981174 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.981164 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:29:38.981349 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.981161 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp"] Apr 17 17:29:38.981349 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.981218 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-fwkvc\"" Apr 17 17:29:38.982045 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.982027 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:29:38.984132 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.984118 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:38.986410 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.986393 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 17:29:38.986526 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.986513 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-ff9m9\"" Apr 17 17:29:38.989170 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.989152 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mdtsg"] Apr 17 17:29:38.992745 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:38.992725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp"] Apr 17 17:29:39.021959 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.021922 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-jbd4x"] Apr 17 17:29:39.025118 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.025097 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.027705 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.027679 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 17:29:39.027927 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.027907 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bv46s\"" Apr 17 17:29:39.036017 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.035999 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jbd4x"] Apr 17 17:29:39.041866 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.041844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pmmc\" (UniqueName: \"kubernetes.io/projected/5124fcc8-c743-49e0-8f45-b25f418a1a10-kube-api-access-9pmmc\") pod \"kserve-controller-manager-85dd7cfb4d-mdtsg\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.041956 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.041920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5124fcc8-c743-49e0-8f45-b25f418a1a10-cert\") pod \"kserve-controller-manager-85dd7cfb4d-mdtsg\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.142776 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.142741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68hk\" (UniqueName: \"kubernetes.io/projected/ce68936e-3393-43fa-857a-68e9825e5fa1-kube-api-access-d68hk\") pod \"seaweedfs-86cc847c5c-jbd4x\" (UID: \"ce68936e-3393-43fa-857a-68e9825e5fa1\") " pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.142960 
ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.142787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pmmc\" (UniqueName: \"kubernetes.io/projected/5124fcc8-c743-49e0-8f45-b25f418a1a10-kube-api-access-9pmmc\") pod \"kserve-controller-manager-85dd7cfb4d-mdtsg\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.142960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.142815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c477d6d-2b0b-4d77-a660-533d36daf2ee-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8rdbp\" (UID: \"6c477d6d-2b0b-4d77-a660-533d36daf2ee\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.142960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.142843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksxc\" (UniqueName: \"kubernetes.io/projected/6c477d6d-2b0b-4d77-a660-533d36daf2ee-kube-api-access-rksxc\") pod \"llmisvc-controller-manager-68cc5db7c4-8rdbp\" (UID: \"6c477d6d-2b0b-4d77-a660-533d36daf2ee\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.142960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.142913 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5124fcc8-c743-49e0-8f45-b25f418a1a10-cert\") pod \"kserve-controller-manager-85dd7cfb4d-mdtsg\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.143122 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.142966 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/ce68936e-3393-43fa-857a-68e9825e5fa1-data\") pod \"seaweedfs-86cc847c5c-jbd4x\" (UID: \"ce68936e-3393-43fa-857a-68e9825e5fa1\") " pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.145144 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.145124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5124fcc8-c743-49e0-8f45-b25f418a1a10-cert\") pod \"kserve-controller-manager-85dd7cfb4d-mdtsg\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.152206 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.152187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pmmc\" (UniqueName: \"kubernetes.io/projected/5124fcc8-c743-49e0-8f45-b25f418a1a10-kube-api-access-9pmmc\") pod \"kserve-controller-manager-85dd7cfb4d-mdtsg\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.244223 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.244133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c477d6d-2b0b-4d77-a660-533d36daf2ee-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8rdbp\" (UID: \"6c477d6d-2b0b-4d77-a660-533d36daf2ee\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.244223 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.244171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rksxc\" (UniqueName: \"kubernetes.io/projected/6c477d6d-2b0b-4d77-a660-533d36daf2ee-kube-api-access-rksxc\") pod \"llmisvc-controller-manager-68cc5db7c4-8rdbp\" (UID: \"6c477d6d-2b0b-4d77-a660-533d36daf2ee\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.244223 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.244220 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ce68936e-3393-43fa-857a-68e9825e5fa1-data\") pod \"seaweedfs-86cc847c5c-jbd4x\" (UID: \"ce68936e-3393-43fa-857a-68e9825e5fa1\") " pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.244540 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.244241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d68hk\" (UniqueName: \"kubernetes.io/projected/ce68936e-3393-43fa-857a-68e9825e5fa1-kube-api-access-d68hk\") pod \"seaweedfs-86cc847c5c-jbd4x\" (UID: \"ce68936e-3393-43fa-857a-68e9825e5fa1\") " pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.244597 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.244582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ce68936e-3393-43fa-857a-68e9825e5fa1-data\") pod \"seaweedfs-86cc847c5c-jbd4x\" (UID: \"ce68936e-3393-43fa-857a-68e9825e5fa1\") " pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.246554 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.246530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6c477d6d-2b0b-4d77-a660-533d36daf2ee-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8rdbp\" (UID: \"6c477d6d-2b0b-4d77-a660-533d36daf2ee\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.257125 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.257103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksxc\" (UniqueName: \"kubernetes.io/projected/6c477d6d-2b0b-4d77-a660-533d36daf2ee-kube-api-access-rksxc\") pod \"llmisvc-controller-manager-68cc5db7c4-8rdbp\" (UID: \"6c477d6d-2b0b-4d77-a660-533d36daf2ee\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.257242 ip-10-0-131-192 kubenswrapper[2573]: I0417 
17:29:39.257188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68hk\" (UniqueName: \"kubernetes.io/projected/ce68936e-3393-43fa-857a-68e9825e5fa1-kube-api-access-d68hk\") pod \"seaweedfs-86cc847c5c-jbd4x\" (UID: \"ce68936e-3393-43fa-857a-68e9825e5fa1\") " pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.289361 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.289324 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:39.296267 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.296240 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:39.335038 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.334982 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:39.429233 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.429202 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mdtsg"] Apr 17 17:29:39.433628 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:29:39.433599 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5124fcc8_c743_49e0_8f45_b25f418a1a10.slice/crio-97f8802ff11b8d89e4ed003cb65f0a2266110decfd636dccdd18249556379f2a WatchSource:0}: Error finding container 97f8802ff11b8d89e4ed003cb65f0a2266110decfd636dccdd18249556379f2a: Status 404 returned error can't find the container with id 97f8802ff11b8d89e4ed003cb65f0a2266110decfd636dccdd18249556379f2a Apr 17 17:29:39.452120 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.452090 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp"] Apr 17 17:29:39.456102 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:29:39.456077 
2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6c477d6d_2b0b_4d77_a660_533d36daf2ee.slice/crio-fa30b33c9d11c9e0cfed325c74150ce24edabdd5d2ebdc77e8defbcb7eca5f65 WatchSource:0}: Error finding container fa30b33c9d11c9e0cfed325c74150ce24edabdd5d2ebdc77e8defbcb7eca5f65: Status 404 returned error can't find the container with id fa30b33c9d11c9e0cfed325c74150ce24edabdd5d2ebdc77e8defbcb7eca5f65 Apr 17 17:29:39.477177 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.477156 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-jbd4x"] Apr 17 17:29:39.479333 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:29:39.479302 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce68936e_3393_43fa_857a_68e9825e5fa1.slice/crio-727bd2009c645ee9f76ede6eaac4c67fadcde114f94f357e9706f0c8f9c939cf WatchSource:0}: Error finding container 727bd2009c645ee9f76ede6eaac4c67fadcde114f94f357e9706f0c8f9c939cf: Status 404 returned error can't find the container with id 727bd2009c645ee9f76ede6eaac4c67fadcde114f94f357e9706f0c8f9c939cf Apr 17 17:29:39.855216 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.855178 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" event={"ID":"5124fcc8-c743-49e0-8f45-b25f418a1a10","Type":"ContainerStarted","Data":"97f8802ff11b8d89e4ed003cb65f0a2266110decfd636dccdd18249556379f2a"} Apr 17 17:29:39.856403 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.856371 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" event={"ID":"6c477d6d-2b0b-4d77-a660-533d36daf2ee","Type":"ContainerStarted","Data":"fa30b33c9d11c9e0cfed325c74150ce24edabdd5d2ebdc77e8defbcb7eca5f65"} Apr 17 17:29:39.857470 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:39.857441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve/seaweedfs-86cc847c5c-jbd4x" event={"ID":"ce68936e-3393-43fa-857a-68e9825e5fa1","Type":"ContainerStarted","Data":"727bd2009c645ee9f76ede6eaac4c67fadcde114f94f357e9706f0c8f9c939cf"} Apr 17 17:29:44.875937 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.875901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" event={"ID":"6c477d6d-2b0b-4d77-a660-533d36daf2ee","Type":"ContainerStarted","Data":"0c3421813891ea6b67fca19247e7cb669febcae7d5b67a7fab3ad070dfff1731"} Apr 17 17:29:44.876339 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.875999 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:29:44.877274 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.877249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-jbd4x" event={"ID":"ce68936e-3393-43fa-857a-68e9825e5fa1","Type":"ContainerStarted","Data":"2929cc4cd824b40a0d29ab944009d7241dab52d480d6a5a438095eb705da14ee"} Apr 17 17:29:44.877382 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.877299 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 17 17:29:44.878501 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.878482 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" event={"ID":"5124fcc8-c743-49e0-8f45-b25f418a1a10","Type":"ContainerStarted","Data":"2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007"} Apr 17 17:29:44.878585 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.878575 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:29:44.905354 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.905307 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" podStartSLOduration=2.431088783 podStartE2EDuration="6.905293327s" podCreationTimestamp="2026-04-17 17:29:38 +0000 UTC" firstStartedPulling="2026-04-17 17:29:39.457521819 +0000 UTC m=+275.125530363" lastFinishedPulling="2026-04-17 17:29:43.931726345 +0000 UTC m=+279.599734907" observedRunningTime="2026-04-17 17:29:44.903342762 +0000 UTC m=+280.571351327" watchObservedRunningTime="2026-04-17 17:29:44.905293327 +0000 UTC m=+280.573301892" Apr 17 17:29:44.921042 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.920997 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-jbd4x" podStartSLOduration=2.414346764 podStartE2EDuration="6.920985031s" podCreationTimestamp="2026-04-17 17:29:38 +0000 UTC" firstStartedPulling="2026-04-17 17:29:39.480548245 +0000 UTC m=+275.148556790" lastFinishedPulling="2026-04-17 17:29:43.98718651 +0000 UTC m=+279.655195057" observedRunningTime="2026-04-17 17:29:44.920401838 +0000 UTC m=+280.588410403" watchObservedRunningTime="2026-04-17 17:29:44.920985031 +0000 UTC m=+280.588993597" Apr 17 17:29:44.945311 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:44.945259 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" podStartSLOduration=2.5310410020000003 podStartE2EDuration="6.94524528s" podCreationTimestamp="2026-04-17 17:29:38 +0000 UTC" firstStartedPulling="2026-04-17 17:29:39.435739592 +0000 UTC m=+275.103748136" lastFinishedPulling="2026-04-17 17:29:43.849943859 +0000 UTC m=+279.517952414" observedRunningTime="2026-04-17 17:29:44.94444826 +0000 UTC m=+280.612456823" watchObservedRunningTime="2026-04-17 17:29:44.94524528 +0000 UTC m=+280.613253846" Apr 17 17:29:50.884069 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:29:50.884035 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-jbd4x" Apr 
17 17:30:04.806093 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:04.806060 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:30:04.806609 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:04.806239 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:30:04.811111 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:04.811091 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:30:15.884166 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:15.884134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8rdbp" Apr 17 17:30:15.887049 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:15.887027 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:30:17.114219 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.114178 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mdtsg"] Apr 17 17:30:17.114654 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.114415 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" podUID="5124fcc8-c743-49e0-8f45-b25f418a1a10" containerName="manager" containerID="cri-o://2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007" gracePeriod=10 Apr 17 17:30:17.153762 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.153726 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-vmdq7"] Apr 17 17:30:17.155923 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.155907 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.186461 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.186418 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-vmdq7"] Apr 17 17:30:17.234228 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.234200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brnxp\" (UniqueName: \"kubernetes.io/projected/f35f105b-7673-47da-a818-6b7fbb4aaedd-kube-api-access-brnxp\") pod \"kserve-controller-manager-85dd7cfb4d-vmdq7\" (UID: \"f35f105b-7673-47da-a818-6b7fbb4aaedd\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.234339 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.234257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f35f105b-7673-47da-a818-6b7fbb4aaedd-cert\") pod \"kserve-controller-manager-85dd7cfb4d-vmdq7\" (UID: \"f35f105b-7673-47da-a818-6b7fbb4aaedd\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.334763 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.334729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brnxp\" (UniqueName: \"kubernetes.io/projected/f35f105b-7673-47da-a818-6b7fbb4aaedd-kube-api-access-brnxp\") pod \"kserve-controller-manager-85dd7cfb4d-vmdq7\" (UID: \"f35f105b-7673-47da-a818-6b7fbb4aaedd\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.334922 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.334783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f35f105b-7673-47da-a818-6b7fbb4aaedd-cert\") pod \"kserve-controller-manager-85dd7cfb4d-vmdq7\" (UID: \"f35f105b-7673-47da-a818-6b7fbb4aaedd\") " 
pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.337172 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.337145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f35f105b-7673-47da-a818-6b7fbb4aaedd-cert\") pod \"kserve-controller-manager-85dd7cfb4d-vmdq7\" (UID: \"f35f105b-7673-47da-a818-6b7fbb4aaedd\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.348913 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.348889 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brnxp\" (UniqueName: \"kubernetes.io/projected/f35f105b-7673-47da-a818-6b7fbb4aaedd-kube-api-access-brnxp\") pod \"kserve-controller-manager-85dd7cfb4d-vmdq7\" (UID: \"f35f105b-7673-47da-a818-6b7fbb4aaedd\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.360236 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.360218 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:30:17.435266 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.435235 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5124fcc8-c743-49e0-8f45-b25f418a1a10-cert\") pod \"5124fcc8-c743-49e0-8f45-b25f418a1a10\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " Apr 17 17:30:17.437202 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.437176 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5124fcc8-c743-49e0-8f45-b25f418a1a10-cert" (OuterVolumeSpecName: "cert") pod "5124fcc8-c743-49e0-8f45-b25f418a1a10" (UID: "5124fcc8-c743-49e0-8f45-b25f418a1a10"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:30:17.497165 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.497120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:17.536265 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.536235 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pmmc\" (UniqueName: \"kubernetes.io/projected/5124fcc8-c743-49e0-8f45-b25f418a1a10-kube-api-access-9pmmc\") pod \"5124fcc8-c743-49e0-8f45-b25f418a1a10\" (UID: \"5124fcc8-c743-49e0-8f45-b25f418a1a10\") " Apr 17 17:30:17.536438 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.536413 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5124fcc8-c743-49e0-8f45-b25f418a1a10-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:30:17.538263 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.538230 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5124fcc8-c743-49e0-8f45-b25f418a1a10-kube-api-access-9pmmc" (OuterVolumeSpecName: "kube-api-access-9pmmc") pod "5124fcc8-c743-49e0-8f45-b25f418a1a10" (UID: "5124fcc8-c743-49e0-8f45-b25f418a1a10"). InnerVolumeSpecName "kube-api-access-9pmmc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:30:17.613724 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.613696 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-vmdq7"] Apr 17 17:30:17.615829 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:30:17.615807 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35f105b_7673_47da_a818_6b7fbb4aaedd.slice/crio-ba0c4adc15b9a31558d4ceebda3fdaae7dc6a722028d401a720d79f3b5f48a94 WatchSource:0}: Error finding container ba0c4adc15b9a31558d4ceebda3fdaae7dc6a722028d401a720d79f3b5f48a94: Status 404 returned error can't find the container with id ba0c4adc15b9a31558d4ceebda3fdaae7dc6a722028d401a720d79f3b5f48a94 Apr 17 17:30:17.616872 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.616858 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:30:17.636844 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.636820 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9pmmc\" (UniqueName: \"kubernetes.io/projected/5124fcc8-c743-49e0-8f45-b25f418a1a10-kube-api-access-9pmmc\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:30:17.981100 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.981068 2573 generic.go:358] "Generic (PLEG): container finished" podID="5124fcc8-c743-49e0-8f45-b25f418a1a10" containerID="2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007" exitCode=0 Apr 17 17:30:17.981204 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.981122 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" Apr 17 17:30:17.981204 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.981152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" event={"ID":"5124fcc8-c743-49e0-8f45-b25f418a1a10","Type":"ContainerDied","Data":"2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007"} Apr 17 17:30:17.981204 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.981192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mdtsg" event={"ID":"5124fcc8-c743-49e0-8f45-b25f418a1a10","Type":"ContainerDied","Data":"97f8802ff11b8d89e4ed003cb65f0a2266110decfd636dccdd18249556379f2a"} Apr 17 17:30:17.981360 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.981216 2573 scope.go:117] "RemoveContainer" containerID="2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007" Apr 17 17:30:17.982714 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.982687 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" event={"ID":"f35f105b-7673-47da-a818-6b7fbb4aaedd","Type":"ContainerStarted","Data":"ba0c4adc15b9a31558d4ceebda3fdaae7dc6a722028d401a720d79f3b5f48a94"} Apr 17 17:30:17.989365 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.989285 2573 scope.go:117] "RemoveContainer" containerID="2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007" Apr 17 17:30:17.989701 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:30:17.989679 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007\": container with ID starting with 2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007 not found: ID does not exist" containerID="2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007" Apr 17 
17:30:17.989819 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:17.989712 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007"} err="failed to get container status \"2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007\": rpc error: code = NotFound desc = could not find container \"2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007\": container with ID starting with 2db24a1515df05112f7543e9c6bd6375111e4c82a7f9852e92bd39358d070007 not found: ID does not exist" Apr 17 17:30:18.002684 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:18.002655 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mdtsg"] Apr 17 17:30:18.007867 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:18.007846 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mdtsg"] Apr 17 17:30:18.924635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:18.924605 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5124fcc8-c743-49e0-8f45-b25f418a1a10" path="/var/lib/kubelet/pods/5124fcc8-c743-49e0-8f45-b25f418a1a10/volumes" Apr 17 17:30:18.987396 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:18.987347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" event={"ID":"f35f105b-7673-47da-a818-6b7fbb4aaedd","Type":"ContainerStarted","Data":"ece4b1660ef2beb809683b9e9cc278b65a3f4ba350123d833ce867d34bd1c4b9"} Apr 17 17:30:18.987598 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:18.987410 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:19.006797 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:19.006749 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" podStartSLOduration=1.6874994380000001 podStartE2EDuration="2.006732708s" podCreationTimestamp="2026-04-17 17:30:17 +0000 UTC" firstStartedPulling="2026-04-17 17:30:17.617015971 +0000 UTC m=+313.285024515" lastFinishedPulling="2026-04-17 17:30:17.936249235 +0000 UTC m=+313.604257785" observedRunningTime="2026-04-17 17:30:19.005237052 +0000 UTC m=+314.673245628" watchObservedRunningTime="2026-04-17 17:30:19.006732708 +0000 UTC m=+314.674741274" Apr 17 17:30:49.995857 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:49.995820 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-vmdq7" Apr 17 17:30:50.865740 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.865702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-8r4gs"] Apr 17 17:30:50.866105 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.866087 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5124fcc8-c743-49e0-8f45-b25f418a1a10" containerName="manager" Apr 17 17:30:50.866186 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.866107 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5124fcc8-c743-49e0-8f45-b25f418a1a10" containerName="manager" Apr 17 17:30:50.866235 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.866202 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5124fcc8-c743-49e0-8f45-b25f418a1a10" containerName="manager" Apr 17 17:30:50.868405 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.868384 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-8r4gs" Apr 17 17:30:50.870932 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.870914 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 17:30:50.871033 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.870919 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-b2x8p\"" Apr 17 17:30:50.873692 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.873669 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7qlln"] Apr 17 17:30:50.876234 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.876219 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7qlln" Apr 17 17:30:50.878910 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.878891 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rjtt8\"" Apr 17 17:30:50.879010 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.878928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 17:30:50.880223 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.880200 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-8r4gs"] Apr 17 17:30:50.893869 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.893840 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7qlln"] Apr 17 17:30:50.988525 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.988483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtwwk\" (UniqueName: \"kubernetes.io/projected/f44e5ad7-86d0-42d5-b866-9890164ad1e7-kube-api-access-dtwwk\") pod 
\"odh-model-controller-696fc77849-7qlln\" (UID: \"f44e5ad7-86d0-42d5-b866-9890164ad1e7\") " pod="kserve/odh-model-controller-696fc77849-7qlln" Apr 17 17:30:50.988689 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.988531 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f44e5ad7-86d0-42d5-b866-9890164ad1e7-cert\") pod \"odh-model-controller-696fc77849-7qlln\" (UID: \"f44e5ad7-86d0-42d5-b866-9890164ad1e7\") " pod="kserve/odh-model-controller-696fc77849-7qlln" Apr 17 17:30:50.988689 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.988602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6tws\" (UniqueName: \"kubernetes.io/projected/799dd967-d8a8-4828-b4de-a2e33078cece-kube-api-access-t6tws\") pod \"model-serving-api-86f7b4b499-8r4gs\" (UID: \"799dd967-d8a8-4828-b4de-a2e33078cece\") " pod="kserve/model-serving-api-86f7b4b499-8r4gs" Apr 17 17:30:50.988689 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:50.988663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/799dd967-d8a8-4828-b4de-a2e33078cece-tls-certs\") pod \"model-serving-api-86f7b4b499-8r4gs\" (UID: \"799dd967-d8a8-4828-b4de-a2e33078cece\") " pod="kserve/model-serving-api-86f7b4b499-8r4gs" Apr 17 17:30:51.089353 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.089317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6tws\" (UniqueName: \"kubernetes.io/projected/799dd967-d8a8-4828-b4de-a2e33078cece-kube-api-access-t6tws\") pod \"model-serving-api-86f7b4b499-8r4gs\" (UID: \"799dd967-d8a8-4828-b4de-a2e33078cece\") " pod="kserve/model-serving-api-86f7b4b499-8r4gs" Apr 17 17:30:51.089720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.089362 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/799dd967-d8a8-4828-b4de-a2e33078cece-tls-certs\") pod \"model-serving-api-86f7b4b499-8r4gs\" (UID: \"799dd967-d8a8-4828-b4de-a2e33078cece\") " pod="kserve/model-serving-api-86f7b4b499-8r4gs" Apr 17 17:30:51.089720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.089440 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtwwk\" (UniqueName: \"kubernetes.io/projected/f44e5ad7-86d0-42d5-b866-9890164ad1e7-kube-api-access-dtwwk\") pod \"odh-model-controller-696fc77849-7qlln\" (UID: \"f44e5ad7-86d0-42d5-b866-9890164ad1e7\") " pod="kserve/odh-model-controller-696fc77849-7qlln" Apr 17 17:30:51.089720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.089462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f44e5ad7-86d0-42d5-b866-9890164ad1e7-cert\") pod \"odh-model-controller-696fc77849-7qlln\" (UID: \"f44e5ad7-86d0-42d5-b866-9890164ad1e7\") " pod="kserve/odh-model-controller-696fc77849-7qlln" Apr 17 17:30:51.091772 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.091745 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f44e5ad7-86d0-42d5-b866-9890164ad1e7-cert\") pod \"odh-model-controller-696fc77849-7qlln\" (UID: \"f44e5ad7-86d0-42d5-b866-9890164ad1e7\") " pod="kserve/odh-model-controller-696fc77849-7qlln" Apr 17 17:30:51.091871 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.091835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/799dd967-d8a8-4828-b4de-a2e33078cece-tls-certs\") pod \"model-serving-api-86f7b4b499-8r4gs\" (UID: \"799dd967-d8a8-4828-b4de-a2e33078cece\") " pod="kserve/model-serving-api-86f7b4b499-8r4gs" Apr 17 17:30:51.097563 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.097540 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtwwk\" (UniqueName: \"kubernetes.io/projected/f44e5ad7-86d0-42d5-b866-9890164ad1e7-kube-api-access-dtwwk\") pod \"odh-model-controller-696fc77849-7qlln\" (UID: \"f44e5ad7-86d0-42d5-b866-9890164ad1e7\") " pod="kserve/odh-model-controller-696fc77849-7qlln"
Apr 17 17:30:51.097670 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.097598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6tws\" (UniqueName: \"kubernetes.io/projected/799dd967-d8a8-4828-b4de-a2e33078cece-kube-api-access-t6tws\") pod \"model-serving-api-86f7b4b499-8r4gs\" (UID: \"799dd967-d8a8-4828-b4de-a2e33078cece\") " pod="kserve/model-serving-api-86f7b4b499-8r4gs"
Apr 17 17:30:51.179510 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.179455 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-8r4gs"
Apr 17 17:30:51.190293 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.190264 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7qlln"
Apr 17 17:30:51.314759 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.314730 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-8r4gs"]
Apr 17 17:30:51.317317 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:30:51.317285 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799dd967_d8a8_4828_b4de_a2e33078cece.slice/crio-ba0f3c21c0f0a7fef8efa225d73e043f159f08efdc566e633d300d25948e319f WatchSource:0}: Error finding container ba0f3c21c0f0a7fef8efa225d73e043f159f08efdc566e633d300d25948e319f: Status 404 returned error can't find the container with id ba0f3c21c0f0a7fef8efa225d73e043f159f08efdc566e633d300d25948e319f
Apr 17 17:30:51.332131 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:51.331999 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7qlln"]
Apr 17 17:30:51.334144 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:30:51.334122 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44e5ad7_86d0_42d5_b866_9890164ad1e7.slice/crio-144162e49c8fabab339532eb7a9e4eb59e2de1e1a57a958bc1f39ac08cb22952 WatchSource:0}: Error finding container 144162e49c8fabab339532eb7a9e4eb59e2de1e1a57a958bc1f39ac08cb22952: Status 404 returned error can't find the container with id 144162e49c8fabab339532eb7a9e4eb59e2de1e1a57a958bc1f39ac08cb22952
Apr 17 17:30:52.097395 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:52.097353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-8r4gs" event={"ID":"799dd967-d8a8-4828-b4de-a2e33078cece","Type":"ContainerStarted","Data":"ba0f3c21c0f0a7fef8efa225d73e043f159f08efdc566e633d300d25948e319f"}
Apr 17 17:30:52.099145 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:52.099114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7qlln" event={"ID":"f44e5ad7-86d0-42d5-b866-9890164ad1e7","Type":"ContainerStarted","Data":"144162e49c8fabab339532eb7a9e4eb59e2de1e1a57a958bc1f39ac08cb22952"}
Apr 17 17:30:55.114265 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:55.114222 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-8r4gs" event={"ID":"799dd967-d8a8-4828-b4de-a2e33078cece","Type":"ContainerStarted","Data":"c2a26ab8f4b280d4dd59ce040235faff290cf8265ce89bdc06f3b2c254eb06f4"}
Apr 17 17:30:55.114705 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:55.114291 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-8r4gs"
Apr 17 17:30:55.115601 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:55.115574 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7qlln" event={"ID":"f44e5ad7-86d0-42d5-b866-9890164ad1e7","Type":"ContainerStarted","Data":"4df01af589e86abac4746e7049e8321298f4f4b8b83bbc33c9502ac521e9866c"}
Apr 17 17:30:55.115725 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:55.115695 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7qlln"
Apr 17 17:30:55.135372 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:55.135327 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-8r4gs" podStartSLOduration=1.947311185 podStartE2EDuration="5.135309007s" podCreationTimestamp="2026-04-17 17:30:50 +0000 UTC" firstStartedPulling="2026-04-17 17:30:51.318951287 +0000 UTC m=+346.986959833" lastFinishedPulling="2026-04-17 17:30:54.506949095 +0000 UTC m=+350.174957655" observedRunningTime="2026-04-17 17:30:55.134452368 +0000 UTC m=+350.802460925" watchObservedRunningTime="2026-04-17 17:30:55.135309007 +0000 UTC m=+350.803317572"
Apr 17 17:30:55.154184 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:30:55.154126 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7qlln" podStartSLOduration=1.956412957 podStartE2EDuration="5.154107196s" podCreationTimestamp="2026-04-17 17:30:50 +0000 UTC" firstStartedPulling="2026-04-17 17:30:51.335421526 +0000 UTC m=+347.003430071" lastFinishedPulling="2026-04-17 17:30:54.533115763 +0000 UTC m=+350.201124310" observedRunningTime="2026-04-17 17:30:55.153987764 +0000 UTC m=+350.821996332" watchObservedRunningTime="2026-04-17 17:30:55.154107196 +0000 UTC m=+350.822115755"
Apr 17 17:31:06.121027 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:06.120992 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7qlln"
Apr 17 17:31:06.123014 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:06.122993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-8r4gs"
Apr 17 17:31:13.454174 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.454143 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56585d5cbc-7p8hq"]
Apr 17 17:31:13.461242 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.461220 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.472230 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.472207 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56585d5cbc-7p8hq"]
Apr 17 17:31:13.578186 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578155 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-config\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.578298 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578203 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-oauth-config\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.578365 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-oauth-serving-cert\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.578365 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-trusted-ca-bundle\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.578477 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578376 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-service-ca\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.578477 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-serving-cert\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.578477 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.578461 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2tdt\" (UniqueName: \"kubernetes.io/projected/a7e392a1-5e5b-4925-9ab0-898dc74cffde-kube-api-access-k2tdt\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679508 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-trusted-ca-bundle\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679639 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-service-ca\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679639 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679548 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-serving-cert\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679639 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2tdt\" (UniqueName: \"kubernetes.io/projected/a7e392a1-5e5b-4925-9ab0-898dc74cffde-kube-api-access-k2tdt\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679639 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-config\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679639 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-oauth-config\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.679888 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.679653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-oauth-serving-cert\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.680257 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.680233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-service-ca\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.680355 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.680312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-oauth-serving-cert\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.680474 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.680453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-trusted-ca-bundle\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.680521 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.680479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-config\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.682068 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.682042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-serving-cert\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.682183 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.682162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a7e392a1-5e5b-4925-9ab0-898dc74cffde-console-oauth-config\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.687925 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.687905 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2tdt\" (UniqueName: \"kubernetes.io/projected/a7e392a1-5e5b-4925-9ab0-898dc74cffde-kube-api-access-k2tdt\") pod \"console-56585d5cbc-7p8hq\" (UID: \"a7e392a1-5e5b-4925-9ab0-898dc74cffde\") " pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.770292 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.770234 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:13.891115 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:13.891085 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56585d5cbc-7p8hq"]
Apr 17 17:31:13.893388 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:31:13.893364 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e392a1_5e5b_4925_9ab0_898dc74cffde.slice/crio-6684ba6b03b0b8025e17fa25658da165aec3b369f69cb1d7c3360e9384f3e739 WatchSource:0}: Error finding container 6684ba6b03b0b8025e17fa25658da165aec3b369f69cb1d7c3360e9384f3e739: Status 404 returned error can't find the container with id 6684ba6b03b0b8025e17fa25658da165aec3b369f69cb1d7c3360e9384f3e739
Apr 17 17:31:14.173844 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:14.173781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56585d5cbc-7p8hq" event={"ID":"a7e392a1-5e5b-4925-9ab0-898dc74cffde","Type":"ContainerStarted","Data":"7fab127da4b0ac4be253709dcd0d164dc03b7abee87d41e2f9bbf9d79df363c0"}
Apr 17 17:31:14.174018 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:14.173851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56585d5cbc-7p8hq" event={"ID":"a7e392a1-5e5b-4925-9ab0-898dc74cffde","Type":"ContainerStarted","Data":"6684ba6b03b0b8025e17fa25658da165aec3b369f69cb1d7c3360e9384f3e739"}
Apr 17 17:31:14.197895 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:14.197839 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56585d5cbc-7p8hq" podStartSLOduration=1.197804665 podStartE2EDuration="1.197804665s" podCreationTimestamp="2026-04-17 17:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:31:14.19570214 +0000 UTC m=+369.863710716" watchObservedRunningTime="2026-04-17 17:31:14.197804665 +0000 UTC m=+369.865813229"
Apr 17 17:31:17.800959 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:17.800929 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"]
Apr 17 17:31:17.804163 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:17.804146 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:17.806480 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:17.806461 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 17 17:31:17.812142 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:17.812122 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"]
Apr 17 17:31:17.917365 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:17.917333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq88v\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:17.917553 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:17.917451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r2kn\" (UniqueName: \"kubernetes.io/projected/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-kube-api-access-6r2kn\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq88v\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:18.018744 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:18.018713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r2kn\" (UniqueName: \"kubernetes.io/projected/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-kube-api-access-6r2kn\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq88v\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:18.018847 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:18.018752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq88v\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:18.019180 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:18.019159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq88v\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:18.027238 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:18.027211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r2kn\" (UniqueName: \"kubernetes.io/projected/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-kube-api-access-6r2kn\") pod \"seaweedfs-tls-custom-ddd4dbfd-vq88v\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:18.113960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:18.113873 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:18.231002 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:18.230979 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"]
Apr 17 17:31:18.233503 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:31:18.233475 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b1ba30_5b8a_4e42_90d0_d3611bfe39ca.slice/crio-4000c4f95a2a352aeea4618c0f9382a68dcc8cf958fdc6e30c02324eae32656e WatchSource:0}: Error finding container 4000c4f95a2a352aeea4618c0f9382a68dcc8cf958fdc6e30c02324eae32656e: Status 404 returned error can't find the container with id 4000c4f95a2a352aeea4618c0f9382a68dcc8cf958fdc6e30c02324eae32656e
Apr 17 17:31:19.193608 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:19.193573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v" event={"ID":"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca","Type":"ContainerStarted","Data":"cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb"}
Apr 17 17:31:19.193608 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:19.193613 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v" event={"ID":"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca","Type":"ContainerStarted","Data":"4000c4f95a2a352aeea4618c0f9382a68dcc8cf958fdc6e30c02324eae32656e"}
Apr 17 17:31:19.219070 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:19.219025 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v" podStartSLOduration=1.974089486 podStartE2EDuration="2.218998193s" podCreationTimestamp="2026-04-17 17:31:17 +0000 UTC" firstStartedPulling="2026-04-17 17:31:18.234882395 +0000 UTC m=+373.902890955" lastFinishedPulling="2026-04-17 17:31:18.479791117 +0000 UTC m=+374.147799662" observedRunningTime="2026-04-17 17:31:19.217927155 +0000 UTC m=+374.885935723" watchObservedRunningTime="2026-04-17 17:31:19.218998193 +0000 UTC m=+374.887006760"
Apr 17 17:31:21.065079 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:21.065044 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"]
Apr 17 17:31:21.199960 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:21.199923 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v" podUID="e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" containerName="seaweedfs-tls-custom" containerID="cri-o://cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb" gracePeriod=30
Apr 17 17:31:23.771121 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:23.771083 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:23.771504 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:23.771135 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:23.775616 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:23.775596 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:24.213682 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:24.213651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56585d5cbc-7p8hq"
Apr 17 17:31:24.275946 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:24.275916 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-964f886d4-458t4"]
Apr 17 17:31:49.145882 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.145855 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:49.249356 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.249274 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r2kn\" (UniqueName: \"kubernetes.io/projected/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-kube-api-access-6r2kn\") pod \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") "
Apr 17 17:31:49.249356 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.249326 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-data\") pod \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\" (UID: \"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca\") "
Apr 17 17:31:49.250745 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.250717 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-data" (OuterVolumeSpecName: "data") pod "e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" (UID: "e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:31:49.251359 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.251334 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-kube-api-access-6r2kn" (OuterVolumeSpecName: "kube-api-access-6r2kn") pod "e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" (UID: "e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca"). InnerVolumeSpecName "kube-api-access-6r2kn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:31:49.291333 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.291302 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" containerID="cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb" exitCode=0
Apr 17 17:31:49.291519 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.291360 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"
Apr 17 17:31:49.291519 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.291388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v" event={"ID":"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca","Type":"ContainerDied","Data":"cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb"}
Apr 17 17:31:49.291519 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.291455 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v" event={"ID":"e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca","Type":"ContainerDied","Data":"4000c4f95a2a352aeea4618c0f9382a68dcc8cf958fdc6e30c02324eae32656e"}
Apr 17 17:31:49.291519 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.291473 2573 scope.go:117] "RemoveContainer" containerID="cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb"
Apr 17 17:31:49.295274 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.295226 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-964f886d4-458t4" podUID="ae91566e-7930-4d06-9bb2-1f6e940120f1" containerName="console" containerID="cri-o://2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113" gracePeriod=15
Apr 17 17:31:49.301837 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.301814 2573 scope.go:117] "RemoveContainer" containerID="cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb"
Apr 17 17:31:49.302177 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:31:49.302153 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb\": container with ID starting with cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb not found: ID does not exist" containerID="cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb"
Apr 17 17:31:49.302243 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.302190 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb"} err="failed to get container status \"cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb\": rpc error: code = NotFound desc = could not find container \"cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb\": container with ID starting with cbaa432a40db32352740b21ed1cf2bc4290bb5539724e8a1d33587c53835b8fb not found: ID does not exist"
Apr 17 17:31:49.313343 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.313319 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"]
Apr 17 17:31:49.317341 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.317315 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-vq88v"]
Apr 17 17:31:49.349375 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.349339 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"]
Apr 17 17:31:49.349759 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.349740 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" containerName="seaweedfs-tls-custom"
Apr 17 17:31:49.349853 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.349762 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" containerName="seaweedfs-tls-custom"
Apr 17 17:31:49.349906 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.349879 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" containerName="seaweedfs-tls-custom"
Apr 17 17:31:49.350984 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.350960 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6r2kn\" (UniqueName: \"kubernetes.io/projected/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-kube-api-access-6r2kn\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:31:49.350984 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.350985 2573 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca-data\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\""
Apr 17 17:31:49.354491 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.354472 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"
Apr 17 17:31:49.356912 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.356895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 17 17:31:49.357008 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.356911 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 17 17:31:49.362355 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.362332 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"]
Apr 17 17:31:49.451389 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.451355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/71a9643e-7f20-4090-8c48-cf75af824826-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"
Apr 17 17:31:49.451567 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.451441 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/71a9643e-7f20-4090-8c48-cf75af824826-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"
Apr 17 17:31:49.451567 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.451470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np52q\" (UniqueName: \"kubernetes.io/projected/71a9643e-7f20-4090-8c48-cf75af824826-kube-api-access-np52q\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"
Apr 17 17:31:49.520592 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.520569 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-964f886d4-458t4_ae91566e-7930-4d06-9bb2-1f6e940120f1/console/0.log"
Apr 17 17:31:49.520729 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.520639 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-964f886d4-458t4"
Apr 17 17:31:49.552027 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552001 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-serving-cert\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552178 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552048 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-config\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552178 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552086 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-oauth-config\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552178 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552114 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-service-ca\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552309 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552268 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-trusted-ca-bundle\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552410 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552321 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrzqx\" (UniqueName: \"kubernetes.io/projected/ae91566e-7930-4d06-9bb2-1f6e940120f1-kube-api-access-rrzqx\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552492 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552479 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-oauth-serving-cert\") pod \"ae91566e-7930-4d06-9bb2-1f6e940120f1\" (UID: \"ae91566e-7930-4d06-9bb2-1f6e940120f1\") "
Apr 17 17:31:49.552549 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-service-ca" (OuterVolumeSpecName: "service-ca") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:31:49.552635 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np52q\" (UniqueName: \"kubernetes.io/projected/71a9643e-7f20-4090-8c48-cf75af824826-kube-api-access-np52q\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"
Apr 17 17:31:49.552746 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552701 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/71a9643e-7f20-4090-8c48-cf75af824826-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"
Apr 17 17:31:49.552746 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552711 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-config" (OuterVolumeSpecName: "console-config") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:31:49.552872 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552798 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:49.553170 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.553141 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:49.553289 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.552833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/71a9643e-7f20-4090-8c48-cf75af824826-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" Apr 17 17:31:49.553289 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.553253 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-trusted-ca-bundle\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.553289 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.553270 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-oauth-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.553289 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.553285 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.553545 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.553301 2573 reconciler_common.go:299] 
"Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ae91566e-7930-4d06-9bb2-1f6e940120f1-service-ca\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.553545 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.553395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/71a9643e-7f20-4090-8c48-cf75af824826-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" Apr 17 17:31:49.554788 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.554763 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:49.555019 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.554993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae91566e-7930-4d06-9bb2-1f6e940120f1-kube-api-access-rrzqx" (OuterVolumeSpecName: "kube-api-access-rrzqx") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "kube-api-access-rrzqx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:49.555085 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.555002 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ae91566e-7930-4d06-9bb2-1f6e940120f1" (UID: "ae91566e-7930-4d06-9bb2-1f6e940120f1"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:49.555514 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.555497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/71a9643e-7f20-4090-8c48-cf75af824826-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" Apr 17 17:31:49.561521 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.561492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np52q\" (UniqueName: \"kubernetes.io/projected/71a9643e-7f20-4090-8c48-cf75af824826-kube-api-access-np52q\") pod \"seaweedfs-tls-custom-5c88b85bb7-c7kvd\" (UID: \"71a9643e-7f20-4090-8c48-cf75af824826\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" Apr 17 17:31:49.654720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.654680 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rrzqx\" (UniqueName: \"kubernetes.io/projected/ae91566e-7930-4d06-9bb2-1f6e940120f1-kube-api-access-rrzqx\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.654720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.654715 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-serving-cert\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.654720 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.654730 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ae91566e-7930-4d06-9bb2-1f6e940120f1-console-oauth-config\") on node \"ip-10-0-131-192.ec2.internal\" DevicePath \"\"" Apr 17 17:31:49.670636 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.670601 2573 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" Apr 17 17:31:49.797051 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:49.797021 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd"] Apr 17 17:31:49.799532 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:31:49.799506 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a9643e_7f20_4090_8c48_cf75af824826.slice/crio-34f9d408e735556a494ae6a4e4af0aaa4933b7bb6bd8fdab4fafdf5c6e53e3bb WatchSource:0}: Error finding container 34f9d408e735556a494ae6a4e4af0aaa4933b7bb6bd8fdab4fafdf5c6e53e3bb: Status 404 returned error can't find the container with id 34f9d408e735556a494ae6a4e4af0aaa4933b7bb6bd8fdab4fafdf5c6e53e3bb Apr 17 17:31:50.297454 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.297400 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" event={"ID":"71a9643e-7f20-4090-8c48-cf75af824826","Type":"ContainerStarted","Data":"76674607e35071833ba1c26586ceba3ba87408df35031eb37295da630f682e01"} Apr 17 17:31:50.297454 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.297461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" event={"ID":"71a9643e-7f20-4090-8c48-cf75af824826","Type":"ContainerStarted","Data":"34f9d408e735556a494ae6a4e4af0aaa4933b7bb6bd8fdab4fafdf5c6e53e3bb"} Apr 17 17:31:50.298576 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.298559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-964f886d4-458t4_ae91566e-7930-4d06-9bb2-1f6e940120f1/console/0.log" Apr 17 17:31:50.298633 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.298594 2573 generic.go:358] "Generic (PLEG): container finished" podID="ae91566e-7930-4d06-9bb2-1f6e940120f1" 
containerID="2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113" exitCode=2 Apr 17 17:31:50.298633 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.298617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-964f886d4-458t4" event={"ID":"ae91566e-7930-4d06-9bb2-1f6e940120f1","Type":"ContainerDied","Data":"2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113"} Apr 17 17:31:50.298699 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.298649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-964f886d4-458t4" event={"ID":"ae91566e-7930-4d06-9bb2-1f6e940120f1","Type":"ContainerDied","Data":"641bb2048fa55621da9d4bcb1b6513f1115b29bdc936506288eef130f604b2a4"} Apr 17 17:31:50.298699 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.298653 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-964f886d4-458t4" Apr 17 17:31:50.298699 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.298664 2573 scope.go:117] "RemoveContainer" containerID="2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113" Apr 17 17:31:50.306679 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.306658 2573 scope.go:117] "RemoveContainer" containerID="2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113" Apr 17 17:31:50.306929 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:31:50.306909 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113\": container with ID starting with 2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113 not found: ID does not exist" containerID="2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113" Apr 17 17:31:50.306995 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.306942 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113"} err="failed to get container status \"2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113\": rpc error: code = NotFound desc = could not find container \"2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113\": container with ID starting with 2eac1c2fcd1676c92cf9c48c659d7d9a08de83df28f77adbf23665c9b8c52113 not found: ID does not exist" Apr 17 17:31:50.315495 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.315396 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-c7kvd" podStartSLOduration=1.051793096 podStartE2EDuration="1.315383893s" podCreationTimestamp="2026-04-17 17:31:49 +0000 UTC" firstStartedPulling="2026-04-17 17:31:49.800742674 +0000 UTC m=+405.468751218" lastFinishedPulling="2026-04-17 17:31:50.064333467 +0000 UTC m=+405.732342015" observedRunningTime="2026-04-17 17:31:50.31326505 +0000 UTC m=+405.981273641" watchObservedRunningTime="2026-04-17 17:31:50.315383893 +0000 UTC m=+405.983392459" Apr 17 17:31:50.329909 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.329881 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-964f886d4-458t4"] Apr 17 17:31:50.338390 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.338363 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-964f886d4-458t4"] Apr 17 17:31:50.924447 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.924393 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae91566e-7930-4d06-9bb2-1f6e940120f1" path="/var/lib/kubelet/pods/ae91566e-7930-4d06-9bb2-1f6e940120f1/volumes" Apr 17 17:31:50.924782 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:50.924769 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca" 
path="/var/lib/kubelet/pods/e9b1ba30-5b8a-4e42-90d0-d3611bfe39ca/volumes" Apr 17 17:31:57.772123 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.772087 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq"] Apr 17 17:31:57.772608 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.772411 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae91566e-7930-4d06-9bb2-1f6e940120f1" containerName="console" Apr 17 17:31:57.772608 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.772421 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae91566e-7930-4d06-9bb2-1f6e940120f1" containerName="console" Apr 17 17:31:57.772608 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.772498 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae91566e-7930-4d06-9bb2-1f6e940120f1" containerName="console" Apr 17 17:31:57.776522 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.776505 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.778981 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.778960 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 17 17:31:57.779104 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.779030 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 17 17:31:57.784637 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.784615 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq"] Apr 17 17:31:57.825861 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.825816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9cq8\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-kube-api-access-c9cq8\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.825861 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.825865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.826098 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.825891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7be1137c-1057-4133-a6ee-b9b523207434-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.926331 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.926290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9cq8\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-kube-api-access-c9cq8\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.926331 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.926335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.926599 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:31:57.926422 2573 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 17 17:31:57.926599 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:31:57.926463 2573 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq: secret "seaweedfs-tls-serving" not found Apr 17 17:31:57.926599 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.926479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7be1137c-1057-4133-a6ee-b9b523207434-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.926599 ip-10-0-131-192 kubenswrapper[2573]: E0417 17:31:57.926517 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-seaweedfs-tls-serving podName:7be1137c-1057-4133-a6ee-b9b523207434 nodeName:}" failed. No retries permitted until 2026-04-17 17:31:58.426497327 +0000 UTC m=+414.094505873 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-sj8hq" (UID: "7be1137c-1057-4133-a6ee-b9b523207434") : secret "seaweedfs-tls-serving" not found Apr 17 17:31:57.926811 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.926789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7be1137c-1057-4133-a6ee-b9b523207434-data\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:57.937514 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:57.937495 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9cq8\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-kube-api-access-c9cq8\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:58.430669 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:58.430622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:58.432988 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:58.432961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/7be1137c-1057-4133-a6ee-b9b523207434-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-sj8hq\" (UID: \"7be1137c-1057-4133-a6ee-b9b523207434\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:58.686271 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:58.686167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" Apr 17 17:31:58.824954 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:58.824906 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq"] Apr 17 17:31:58.827107 ip-10-0-131-192 kubenswrapper[2573]: W0417 17:31:58.827077 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be1137c_1057_4133_a6ee_b9b523207434.slice/crio-11ab272e016f5f49289bda7f0a9a5e9bc010a4dadfd40411cd1737ccde39c2f9 WatchSource:0}: Error finding container 11ab272e016f5f49289bda7f0a9a5e9bc010a4dadfd40411cd1737ccde39c2f9: Status 404 returned error can't find the container with id 11ab272e016f5f49289bda7f0a9a5e9bc010a4dadfd40411cd1737ccde39c2f9 Apr 17 17:31:59.329848 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:59.329749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" event={"ID":"7be1137c-1057-4133-a6ee-b9b523207434","Type":"ContainerStarted","Data":"5d530b7fa17f6892cad0a9bb7fb1eb95ad26c121d111c72810113df81a65a602"} Apr 17 17:31:59.329848 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:59.329797 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" event={"ID":"7be1137c-1057-4133-a6ee-b9b523207434","Type":"ContainerStarted","Data":"11ab272e016f5f49289bda7f0a9a5e9bc010a4dadfd40411cd1737ccde39c2f9"} Apr 17 17:31:59.348232 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:31:59.348180 
2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-sj8hq" podStartSLOduration=2.098927308 podStartE2EDuration="2.348166233s" podCreationTimestamp="2026-04-17 17:31:57 +0000 UTC" firstStartedPulling="2026-04-17 17:31:58.828252303 +0000 UTC m=+414.496260848" lastFinishedPulling="2026-04-17 17:31:59.077491225 +0000 UTC m=+414.745499773" observedRunningTime="2026-04-17 17:31:59.346662913 +0000 UTC m=+415.014671480" watchObservedRunningTime="2026-04-17 17:31:59.348166233 +0000 UTC m=+415.016174799" Apr 17 17:35:04.829311 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:35:04.829284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:35:04.831311 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:35:04.831289 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:40:04.850451 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:40:04.850354 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:40:04.853500 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:40:04.853477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:45:04.872542 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:45:04.872511 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:45:04.875164 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:45:04.875145 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:50:04.895880 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:50:04.895848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:50:04.899557 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:50:04.899535 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:55:04.916269 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:55:04.916233 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 17:55:04.921610 ip-10-0-131-192 kubenswrapper[2573]: I0417 17:55:04.921585 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:00:04.937844 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:00:04.937816 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:00:04.943730 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:00:04.943709 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:05:04.959194 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:05:04.959166 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:05:04.968516 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:05:04.968489 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:10:04.982113 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:10:04.982087 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:10:04.990498 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:10:04.990480 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:15:05.004134 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:15:05.004025 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:15:05.012026 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:15:05.012005 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:20:05.024400 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:20:05.024291 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:20:05.033563 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:20:05.033543 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:25:05.046884 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:25:05.046760 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:25:05.056393 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:25:05.056370 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:28:25.778648 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:25.778568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-b4bv4_7bea17df-5d4f-4d36-a769-2336b30abea8/global-pull-secret-syncer/0.log" Apr 17 18:28:25.916648 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:25.916619 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c87qk_d7db7d0e-fe44-4b7a-9633-00ced1914759/konnectivity-agent/0.log" Apr 17 18:28:26.011343 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:26.011312 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-192.ec2.internal_e647b5a59d131d511f2d693e03bddab4/haproxy/0.log" Apr 17 18:28:29.657594 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:29.657565 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f8dbf5fdc-sfvf8_90933178-05d1-4eff-9f6c-a4b256cf0f68/metrics-server/0.log" Apr 17 18:28:29.686568 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:29.686539 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4d8sx_39b7ffe1-d40b-4c40-8bfa-72243d49873b/monitoring-plugin/0.log" Apr 17 18:28:29.810125 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:29.810100 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d69mc_686baeef-d269-4500-9e3f-aed10f7f1c7d/node-exporter/0.log" Apr 17 18:28:29.838437 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:29.838409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d69mc_686baeef-d269-4500-9e3f-aed10f7f1c7d/kube-rbac-proxy/0.log" Apr 17 18:28:29.863140 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:29.863111 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-d69mc_686baeef-d269-4500-9e3f-aed10f7f1c7d/init-textfile/0.log" Apr 17 18:28:30.249851 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.249807 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-xt4mn_f401d650-f1a7-427c-9b26-1fbb623c4372/prometheus-operator/0.log" Apr 17 18:28:30.273249 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.273225 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-xt4mn_f401d650-f1a7-427c-9b26-1fbb623c4372/kube-rbac-proxy/0.log" Apr 17 18:28:30.302761 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.302733 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-c6jkp_256258e8-cdda-4bc4-9dd8-63bc774b6915/prometheus-operator-admission-webhook/0.log" Apr 17 18:28:30.410510 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.410483 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-669776895b-schm7_9a0915b4-8767-4493-ac81-7885fb3dd23a/thanos-query/0.log" Apr 17 18:28:30.449527 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.449500 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-669776895b-schm7_9a0915b4-8767-4493-ac81-7885fb3dd23a/kube-rbac-proxy-web/0.log" Apr 17 18:28:30.478273 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.478243 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-669776895b-schm7_9a0915b4-8767-4493-ac81-7885fb3dd23a/kube-rbac-proxy/0.log" Apr 17 18:28:30.514342 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.514272 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-669776895b-schm7_9a0915b4-8767-4493-ac81-7885fb3dd23a/prom-label-proxy/0.log" Apr 17 18:28:30.560210 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.560185 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-669776895b-schm7_9a0915b4-8767-4493-ac81-7885fb3dd23a/kube-rbac-proxy-rules/0.log" Apr 17 18:28:30.593840 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:30.593816 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-669776895b-schm7_9a0915b4-8767-4493-ac81-7885fb3dd23a/kube-rbac-proxy-metrics/0.log" Apr 17 18:28:32.484898 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.484865 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56585d5cbc-7p8hq_a7e392a1-5e5b-4925-9ab0-898dc74cffde/console/0.log" Apr 17 18:28:32.803789 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.803707 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2"] Apr 17 18:28:32.807058 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.807036 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:32.809331 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.809310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2d5hj\"/\"openshift-service-ca.crt\"" Apr 17 18:28:32.809456 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.809310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2d5hj\"/\"kube-root-ca.crt\"" Apr 17 18:28:32.810075 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.810061 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2d5hj\"/\"default-dockercfg-gs5n6\"" Apr 17 18:28:32.816649 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.816623 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2"] Apr 17 18:28:32.905852 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.905814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-podres\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:32.905852 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.905861 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-sys\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:32.906086 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.905885 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-lib-modules\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:32.906086 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.905924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzmm\" (UniqueName: \"kubernetes.io/projected/9b448863-7f68-4b59-8f9d-28c93376d2a0-kube-api-access-bzzmm\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:32.906086 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:32.905948 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-proc\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006448 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-sys\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006644 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-lib-modules\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " 
pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006644 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzmm\" (UniqueName: \"kubernetes.io/projected/9b448863-7f68-4b59-8f9d-28c93376d2a0-kube-api-access-bzzmm\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006644 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-sys\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006644 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-proc\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006644 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-lib-modules\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006881 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-podres\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006881 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-proc\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.006881 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.006747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b448863-7f68-4b59-8f9d-28c93376d2a0-podres\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.014475 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.014450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzmm\" (UniqueName: \"kubernetes.io/projected/9b448863-7f68-4b59-8f9d-28c93376d2a0-kube-api-access-bzzmm\") pod \"perf-node-gather-daemonset-d45h2\" (UID: \"9b448863-7f68-4b59-8f9d-28c93376d2a0\") " pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.117606 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.117514 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:33.240803 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.240777 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2"] Apr 17 18:28:33.243730 ip-10-0-131-192 kubenswrapper[2573]: W0417 18:28:33.243696 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9b448863_7f68_4b59_8f9d_28c93376d2a0.slice/crio-a18518906bb7375aa7264b71a96bba7a16400c8d3c96d662c677c4014a21f41a WatchSource:0}: Error finding container a18518906bb7375aa7264b71a96bba7a16400c8d3c96d662c677c4014a21f41a: Status 404 returned error can't find the container with id a18518906bb7375aa7264b71a96bba7a16400c8d3c96d662c677c4014a21f41a Apr 17 18:28:33.245311 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.245292 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:28:33.329953 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.329918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" event={"ID":"9b448863-7f68-4b59-8f9d-28c93376d2a0","Type":"ContainerStarted","Data":"a18518906bb7375aa7264b71a96bba7a16400c8d3c96d662c677c4014a21f41a"} Apr 17 18:28:33.753176 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.753140 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zz2w5_69aceb1b-03e3-432d-9fbc-1fc05822fa8a/dns/0.log" Apr 17 18:28:33.776305 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.776282 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zz2w5_69aceb1b-03e3-432d-9fbc-1fc05822fa8a/kube-rbac-proxy/0.log" Apr 17 18:28:33.799775 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:33.799749 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-plmjc_9cfa6ba8-721d-4b42-963e-828ffe17cdbd/dns-node-resolver/0.log" Apr 17 18:28:34.330870 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:34.330829 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gcsv8_3c913c62-8158-4429-8205-ec0b912f3e95/node-ca/0.log" Apr 17 18:28:34.333721 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:34.333695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" event={"ID":"9b448863-7f68-4b59-8f9d-28c93376d2a0","Type":"ContainerStarted","Data":"ebedad2a595da4e2472226f74abaf605bbe126b4762d5bc09851361316ce387b"} Apr 17 18:28:34.333849 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:34.333787 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:34.348858 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:34.348818 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" podStartSLOduration=2.348803304 podStartE2EDuration="2.348803304s" podCreationTimestamp="2026-04-17 18:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:28:34.348691228 +0000 UTC m=+3810.016699807" watchObservedRunningTime="2026-04-17 18:28:34.348803304 +0000 UTC m=+3810.016811874" Apr 17 18:28:35.438763 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:35.438732 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zjcw2_98ef03b7-627d-4085-85f1-8f4765fe1c45/serve-healthcheck-canary/0.log" Apr 17 18:28:35.878372 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:35.878301 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-w7lff_3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b/kube-rbac-proxy/0.log" Apr 17 18:28:35.900516 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:35.900482 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w7lff_3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b/exporter/0.log" Apr 17 18:28:35.921276 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:35.921252 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-w7lff_3cfa574d-b7a8-4bbc-9945-ef65d3a62e0b/extractor/0.log" Apr 17 18:28:37.944234 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:37.944200 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-85dd7cfb4d-vmdq7_f35f105b-7673-47da-a818-6b7fbb4aaedd/manager/0.log" Apr 17 18:28:37.965984 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:37.965952 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-8rdbp_6c477d6d-2b0b-4d77-a660-533d36daf2ee/manager/0.log" Apr 17 18:28:37.991813 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:37.991782 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-8r4gs_799dd967-d8a8-4828-b4de-a2e33078cece/server/0.log" Apr 17 18:28:38.203458 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:38.203355 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7qlln_f44e5ad7-86d0-42d5-b866-9890164ad1e7/manager/0.log" Apr 17 18:28:38.364269 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:38.364228 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-jbd4x_ce68936e-3393-43fa-857a-68e9825e5fa1/seaweedfs/0.log" Apr 17 18:28:38.391178 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:38.391151 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-c7kvd_71a9643e-7f20-4090-8c48-cf75af824826/seaweedfs-tls-custom/0.log" Apr 17 18:28:38.432840 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:38.432807 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-sj8hq_7be1137c-1057-4133-a6ee-b9b523207434/seaweedfs-tls-serving/0.log" Apr 17 18:28:40.346581 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:40.346553 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2d5hj/perf-node-gather-daemonset-d45h2" Apr 17 18:28:43.891272 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:43.891244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/kube-multus-additional-cni-plugins/0.log" Apr 17 18:28:43.913075 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:43.913050 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/egress-router-binary-copy/0.log" Apr 17 18:28:43.933748 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:43.933719 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/cni-plugins/0.log" Apr 17 18:28:43.956156 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:43.956134 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/bond-cni-plugin/0.log" Apr 17 18:28:43.977667 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:43.977644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/routeoverride-cni/0.log" Apr 17 18:28:44.000234 ip-10-0-131-192 kubenswrapper[2573]: I0417 
18:28:44.000201 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/whereabouts-cni-bincopy/0.log" Apr 17 18:28:44.023548 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:44.023514 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zj4h_5b573982-e564-43dc-809a-f117e117fa31/whereabouts-cni/0.log" Apr 17 18:28:44.442011 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:44.441939 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pgqq4_6fd7c0bf-ef91-422c-8dc8-5bc7192ade41/kube-multus/0.log" Apr 17 18:28:44.508583 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:44.508554 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6hw86_54c39df0-963a-429e-b7e9-1cf754453932/network-metrics-daemon/0.log" Apr 17 18:28:44.532198 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:44.532171 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6hw86_54c39df0-963a-429e-b7e9-1cf754453932/kube-rbac-proxy/0.log" Apr 17 18:28:45.683239 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.683209 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-controller/0.log" Apr 17 18:28:45.702039 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.702010 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/0.log" Apr 17 18:28:45.717846 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.717821 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovn-acl-logging/1.log" Apr 17 18:28:45.737499 ip-10-0-131-192 
kubenswrapper[2573]: I0417 18:28:45.737476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/kube-rbac-proxy-node/0.log" Apr 17 18:28:45.761323 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.761295 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 18:28:45.781083 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.781053 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/northd/0.log" Apr 17 18:28:45.801244 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.801219 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/nbdb/0.log" Apr 17 18:28:45.820992 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.820968 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/sbdb/0.log" Apr 17 18:28:45.926814 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:45.926784 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hr974_7b50982c-98df-4df3-9669-7a741ea95eb6/ovnkube-controller/0.log" Apr 17 18:28:47.443791 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:47.443766 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-56d9d_3dbf031f-03a8-4194-a694-20fe7307d30f/network-check-target-container/0.log" Apr 17 18:28:48.389355 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:48.389331 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-j82wk_f717fb0f-c30a-4760-8df6-5eb06082af1b/iptables-alerter/0.log" Apr 17 
18:28:49.143906 ip-10-0-131-192 kubenswrapper[2573]: I0417 18:28:49.143871 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-pj447_716e40c1-df03-46db-92f3-31f34b85f083/tuned/0.log"