Apr 17 16:17:22.336454 ip-10-0-136-214 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:17:22.336468 ip-10-0-136-214 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:17:22.336478 ip-10-0-136-214 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:17:22.336799 ip-10-0-136-214 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:17:32.572384 ip-10-0-136-214 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:17:32.572399 ip-10-0-136-214 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f3dae0c2141449c79a0c492934ff24c8 --
Apr 17 16:20:13.505528 ip-10-0-136-214 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:20:13.957489 ip-10-0-136-214 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:20:13.957489 ip-10-0-136-214 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:20:13.957489 ip-10-0-136-214 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:20:13.957489 ip-10-0-136-214 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:20:13.957489 ip-10-0-136-214 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
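
[Editor's note: the five deprecation warnings above all point at the same remedy: move those flags into the KubeletConfiguration file named by --config (on this node /etc/kubernetes/kubelet.conf, per the FLAG dump later in this log). A minimal sketch of the equivalent config-file fields, using the values this kubelet was actually started with; the evictionHard value is illustrative only, and --pod-infra-container-image has no config-file equivalent because the sandbox image is owned by the CRI runtime.]

  # Sketch: /etc/kubernetes/kubelet.conf fragments replacing the deprecated flags
  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  containerRuntimeEndpoint: unix:///var/run/crio/crio.sock     # was --container-runtime-endpoint
  volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec # was --volume-plugin-dir
  systemReserved:                                              # was --system-reserved
    cpu: 500m
    ephemeral-storage: 1Gi
    memory: 1Gi
  evictionHard:                 # replaces --minimum-container-ttl-duration tuning; illustrative value
    memory.available: 100Mi
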
Apr 17 16:20:13.959920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.959832 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:20:13.964310 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964286 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:20:13.964310 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964306 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:20:13.964310 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964311 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:20:13.964310 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964315 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:20:13.964310 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964319 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964322 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964325 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964328 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964331 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964335 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964337 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964340 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964343 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964346 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964349 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964352 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964355 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964358 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964360 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964363 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964366 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964369 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964372 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:20:13.964506 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964378 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964382 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964384 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964387 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964390 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964392 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964395 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964398 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964400 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964403 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964406 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964409 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964411 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964414 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964416 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964421 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964423 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964426 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964429 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964432 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:20:13.964959 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964435 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964437 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964441 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964444 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964446 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964449 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964452 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964454 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964456 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964459 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964462 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964464 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964468 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964471 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964475 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964479 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964483 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964486 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964488 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964491 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:20:13.965496 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964494 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964497 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964499 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964502 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964504 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964507 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964510 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964514 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964517 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964520 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964523 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964526 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964529 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964531 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964534 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964539 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964544 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964547 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964550 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:20:13.965972 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964553 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964556 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964559 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964562 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964965 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964971 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964974 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964977 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964980 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964983 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964986 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964988 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964991 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964994 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.964997 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965000 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965002 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965005 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965007 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 
16:20:13.965010 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:20:13.966438 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965013 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965016 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965018 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965021 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965024 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965026 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965029 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965032 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965035 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965038 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965040 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965043 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965045 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965048 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965050 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965053 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965055 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965058 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965060 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965068 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:20:13.966912 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965071 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965075 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965078 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965081 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965084 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965086 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965089 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965092 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965094 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965097 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965100 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965102 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965105 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965108 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965111 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965114 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965116 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965119 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965121 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:20:13.967420 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965124 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965126 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965129 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965132 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965134 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:20:13.967882 ip-10-0-136-214 
kubenswrapper[2569]: W0417 16:20:13.965136 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965139 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965141 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965146 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965149 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965152 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965155 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965158 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965160 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965163 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965165 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965168 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965170 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965173 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965175 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:20:13.967882 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965178 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965180 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965183 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965185 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965188 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965191 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965194 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965196 2569 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965199 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965202 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.965204 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966004 2569 flags.go:64] FLAG: --address="0.0.0.0" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966013 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966024 2569 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966029 2569 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966034 2569 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966038 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966042 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966048 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966051 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966054 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 16:20:13.968483 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966058 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966062 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966065 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966068 2569 flags.go:64] FLAG: --cgroup-root="" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966071 2569 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966074 2569 flags.go:64] FLAG: --client-ca-file="" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966077 2569 flags.go:64] FLAG: --cloud-config="" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966080 2569 flags.go:64] FLAG: --cloud-provider="external" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966083 2569 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966088 2569 flags.go:64] FLAG: --cluster-domain="" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966090 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 16:20:13.968996 
ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966093 2569 flags.go:64] FLAG: --config-dir="" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966096 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966100 2569 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966104 2569 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966107 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966110 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966114 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966117 2569 flags.go:64] FLAG: --contention-profiling="false" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966120 2569 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966123 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966126 2569 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966129 2569 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966133 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966136 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 16:20:13.968996 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966139 2569 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966142 2569 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966146 2569 flags.go:64] FLAG: --enable-server="true" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966149 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966154 2569 flags.go:64] FLAG: --event-burst="100" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966157 2569 flags.go:64] FLAG: --event-qps="50" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966160 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966164 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966167 2569 flags.go:64] FLAG: --eviction-hard="" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966171 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966174 2569 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966177 2569 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966180 2569 flags.go:64] FLAG: --eviction-soft="" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966184 2569 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966187 2569 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966190 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966193 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966196 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966199 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966202 2569 flags.go:64] FLAG: --feature-gates="" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966207 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966209 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966213 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966216 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966219 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:20:13.969632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966222 2569 flags.go:64] FLAG: --help="false" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966239 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-136-214.ec2.internal" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966244 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966249 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966254 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966259 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966264 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966268 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966271 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966274 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966278 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966281 2569 flags.go:64] 
FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966284 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966288 2569 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966291 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966294 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966297 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966300 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966303 2569 flags.go:64] FLAG: --lock-file="" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966306 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966309 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966312 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966318 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:20:13.970242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966321 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966324 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966327 2569 flags.go:64] FLAG: --logging-format="text" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966330 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966333 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966337 2569 flags.go:64] FLAG: --manifest-url="" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966340 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966344 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966347 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966351 2569 flags.go:64] FLAG: --max-pods="110" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966354 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966357 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966360 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966364 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:20:13.966369 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966372 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966375 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966383 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966386 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966389 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966392 2569 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966395 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966401 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966404 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:20:13.970793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966407 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966410 2569 flags.go:64] FLAG: --port="10250" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966413 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966419 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0372b5c025859e80d" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966423 2569 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966426 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966429 2569 flags.go:64] FLAG: --register-node="true" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966432 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966434 2569 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966438 2569 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966441 2569 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966444 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966448 2569 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966451 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966455 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966458 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 
16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966461 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966463 2569 flags.go:64] FLAG: --runonce="false" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966466 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966470 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966472 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966475 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966480 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966483 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966486 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966489 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:20:13.971420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966492 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966495 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966497 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966501 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966505 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966508 2569 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966511 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966516 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966519 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966523 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966528 2569 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966531 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966534 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966537 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966540 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966543 2569 flags.go:64] FLAG: 
--v="2" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966547 2569 flags.go:64] FLAG: --version="false" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966552 2569 flags.go:64] FLAG: --vmodule="" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966556 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.966559 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966655 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966659 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966661 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966664 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:20:13.972036 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966667 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966670 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966673 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966675 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966679 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966681 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966684 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966686 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966689 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966691 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966694 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966697 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966700 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966702 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966705 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 
17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966707 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966710 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966713 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966718 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:20:13.972686 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966721 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966724 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966727 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966730 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966733 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966735 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966738 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966741 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966743 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966746 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966748 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966751 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966753 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966756 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966758 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966761 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966764 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966771 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:20:13.973177 ip-10-0-136-214 
Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966774 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966776 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:20:13.973177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966779 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966782 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966784 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966787 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966789 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966792 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966796 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966799 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966802 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966804 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966809 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966813 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966816 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966818 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966821 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966824 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966826 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966829 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966831 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:20:13.973700 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966834 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966837 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966839 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966842 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966844 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966847 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966849 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966852 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966855 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966857 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966861 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966864 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966866 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966869 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966871 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966873 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966876 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966879 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966882 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966884 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:20:13.974170 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966887 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966890 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966893 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.966896 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.967665 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.973917 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.974032 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974083 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974088 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974092 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974095 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974099 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974101 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974104 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974108 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974110 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:20:13.974679 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974113 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974116 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974120 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974122 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974125 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974127 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974130 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974133 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974136 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974138 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974142 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974147 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974150 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974152 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974155 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974158 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974161 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974163 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974166 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974169 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:20:13.975077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974171 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974174 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974178 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974182 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974185 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974187 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974191 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974195 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974198 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974201 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974204 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974207 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974210 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974212 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974215 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974218 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974221 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974240 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974244 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:20:13.975656 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974247 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974250 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974253 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974257 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974259 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974262 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974265 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974268 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974271 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974273 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974276 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974279 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974281 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974284 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974286 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974290 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974294 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974296 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974299 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974302 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:20:13.976157 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974305 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974307 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974310 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974312 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974315 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974318 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974321 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974323 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974325 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974328 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974330 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974333 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974335 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974338 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974340 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974343 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974345 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:20:13.976660 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974347 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.974353 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974452 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974456 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974459 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974462 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974465 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974468 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974471 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974473 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974476 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974479 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974482 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974485 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974488 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:20:13.977072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974491 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974493 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974496 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974498 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974501 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974503 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974507 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974511 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974514 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974517 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974519 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974522 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974525 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974527 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974530 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974532 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974535 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974537 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974540 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974542 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:20:13.977454 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974545 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974547 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974550 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974554 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974556 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974559 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974561 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974564 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974567 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974570 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974573 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974576 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974578 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974581 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974583 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974586 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974588 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974590 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974593 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:20:13.978000 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974596 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974598 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974601 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974603 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974606 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974608 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974610 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974613 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974615 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974618 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974620 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974623 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974625 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974629 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974632 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974634 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974637 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974640 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974642 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974645 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:20:13.978549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974648 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974650 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974653 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974655 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974658 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974661 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974663 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974666 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974668 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974671 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974673 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974676 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974679 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:13.974681 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.974686 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
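[Editor's note] The three near-identical warning sweeps above come from the kubelet parsing its feature-gate configuration several times during startup: gate names that OpenShift's operators define but that this kubelet does not register are flagged and skipped rather than failing startup, and each "feature gates: {map[...]}" line reports the resolved set that was actually applied. A minimal sketch of the upstream registration mechanism, assuming k8s.io/component-base/featuregate; the gate names and defaults below are illustrative, not the kubelet's real table:

    package main

    import (
            "fmt"

            "k8s.io/component-base/featuregate"
    )

    func main() {
            gates := featuregate.NewFeatureGate()

            // Only names registered here count as "recognized".
            if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
                    "NodeSwap":    {Default: false, PreRelease: featuregate.Beta},
                    "ImageVolume": {Default: false, PreRelease: featuregate.Beta},
            }); err != nil {
                    panic(err)
            }

            // Names missing from the registry are flagged at this layer.
            // Depending on the component-base version this surfaces as a
            // returned error or, as in the kubelet log above, a logged
            // "unrecognized feature gate" warning that is then ignored.
            if err := gates.SetFromMap(map[string]bool{
                    "ImageVolume":           true,
                    "SomeOpenShiftOnlyGate": true, // hypothetical unknown name
            }); err != nil {
                    fmt.Println("set:", err)
            }

            fmt.Println("ImageVolume enabled:", gates.Enabled("ImageVolume"))
    }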
Apr 17 16:20:13.979035 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.975503 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:20:13.979505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.977496 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:20:13.979505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.978325 2569 server.go:1019] "Starting client certificate rotation"
Apr 17 16:20:13.979505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.978435 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:20:13.979505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:13.979155 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:20:14.001769 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.001745 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:20:14.004473 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.004451 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:20:14.019560 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.019528 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:20:14.027564 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.027546 2569 log.go:25] "Validated CRI v1 image API"
Apr 17 16:20:14.028851 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.028836 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:20:14.033653 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.033627 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:20:14.034796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.034774 2569 fs.go:135] Filesystem UUIDs: map[0d6f24a5-8ada-45cd-9415-81356beaf531:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f50c1156-6979-4bdb-9e67-5865b9377ed2:/dev/nvme0n1p3]
Apr 17 16:20:14.034850 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.034798 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:20:14.040440 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.040325 2569 manager.go:217] Machine: {Timestamp:2026-04-17 16:20:14.038417174 +0000 UTC m=+0.419045527 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098827 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d43f9f9513833ad032993b0af3f06 SystemUUID:ec2d43f9-f951-3833-ad03-2993b0af3f06 BootID:f3dae0c2-1414-49c7-9a0c-492934ff24c8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b8:7c:15:26:79 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b8:7c:15:26:79 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:96:86:d2:83:0e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:20:14.040440 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.040435 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:20:14.040580 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.040568 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:20:14.043000 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.042973 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:20:14.043141 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.043002 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-214.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
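[Editor's note] The nodeConfig entry above, combined with MemoryCapacity from the Machine line, fixes everything needed to predict this node's allocatable memory via the standard formula allocatable = capacity - kube-reserved - system-reserved - hard-eviction threshold. A back-of-the-envelope check, assuming that documented formula (the kubelet computes this internally):

    package main

    import "fmt"

    func main() {
            const (
                    Mi = int64(1) << 20
                    Gi = int64(1) << 30

                    capacity       = int64(33164496896) // MemoryCapacity, manager.go:217
                    kubeReserved   = int64(0)           // "KubeReserved":null
                    systemReserved = 1 * Gi             // "SystemReserved":{"memory":"1Gi"}
                    evictionHard   = 100 * Mi           // memory.available < 100Mi
            )

            allocatable := capacity - kubeReserved - systemReserved - evictionHard
            fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
                    allocatable, float64(allocatable)/float64(Gi))
            // Prints: allocatable memory: 31985897472 bytes (~29.79 GiB)
    }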
Apr 17 16:20:14.043183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.043151 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:20:14.043183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.043159 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:20:14.043183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.043177 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:20:14.044722 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.044711 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:20:14.046109 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.046099 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:20:14.046216 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.046207 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:20:14.048450 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.048440 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:20:14.048486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.048454 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:20:14.048486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.048470 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:20:14.048486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.048482 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:20:14.048605 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.048492 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:20:14.049627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.049613 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:20:14.049665 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.049639 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:20:14.053303 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.053285 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:20:14.054911 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.054898 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:20:14.055315 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.055295 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zjwcv"
Apr 17 16:20:14.056638 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056625 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:20:14.056687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056647 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:20:14.056687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056658 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:20:14.056687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056665 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:20:14.056687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056679 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 16:20:14.056687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056685 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 16:20:14.056814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056691 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 16:20:14.056814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056697 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 16:20:14.056814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056704 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 16:20:14.056814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056710 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 16:20:14.056814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056722 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 16:20:14.056814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.056732 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 16:20:14.058703 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.058686 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 16:20:14.058703 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.058705 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 16:20:14.061275 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.061258 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zjwcv"
Apr 17 16:20:14.062757 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.062744 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 16:20:14.062848 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.062785 2569 server.go:1295] "Started kubelet"
Apr 17 16:20:14.062959 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.062943 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-214.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 16:20:14.063019 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.062956 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 16:20:14.063084 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.063007 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 16:20:14.063084 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.063059 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 16:20:14.063176 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.063122 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-214.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 16:20:14.063221 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.063181 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 16:20:14.063676 ip-10-0-136-214 systemd[1]: Started Kubernetes Kubelet.
Apr 17 16:20:14.064350 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.064204 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 16:20:14.068591 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.068567 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
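[Editor's note] The pod-resources endpoint just announced is a local gRPC service (rate-limited per the ratelimit.go line to qps=100). A minimal client sketch, assuming the k8s.io/kubelet/pkg/apis/podresources/v1 API and node-local access to the socket; error handling is trimmed for brevity:

    package main

    import (
            "context"
            "fmt"
            "log"
            "time"

            "google.golang.org/grpc"
            "google.golang.org/grpc/credentials/insecure"
            podresourcesapi "k8s.io/kubelet/pkg/apis/podresources/v1"
    )

    func main() {
            // Dial the kubelet's pod-resources socket from the log line above.
            conn, err := grpc.Dial("unix:///var/lib/kubelet/pod-resources/kubelet.sock",
                    grpc.WithTransportCredentials(insecure.NewCredentials()))
            if err != nil {
                    log.Fatal(err)
            }
            defer conn.Close()

            ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
            defer cancel()

            // List pods and their device/CPU assignments as the kubelet sees them.
            resp, err := podresourcesapi.NewPodResourcesListerClient(conn).
                    List(ctx, &podresourcesapi.ListPodResourcesRequest{})
            if err != nil {
                    log.Fatal(err)
            }
            for _, pod := range resp.GetPodResources() {
                    fmt.Printf("%s/%s: %d container(s)\n",
                            pod.GetNamespace(), pod.GetName(), len(pod.GetContainers()))
            }
    }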
Apr 17 16:20:14.073473 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.073453 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 16:20:14.074621 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.074600 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 16:20:14.075144 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075128 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 16:20:14.075770 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075753 2569 factory.go:55] Registering systemd factory
Apr 17 16:20:14.075847 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075779 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 17 16:20:14.075847 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075757 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 16:20:14.075847 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075760 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 16:20:14.075847 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075830 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 16:20:14.075984 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075944 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 16:20:14.075984 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075953 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 16:20:14.075984 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075956 2569 factory.go:153] Registering CRI-O factory
Apr 17 16:20:14.075984 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.075967 2569 factory.go:223] Registration of the crio container factory successfully
Apr 17 16:20:14.076145 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.076006 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found"
Apr 17 16:20:14.076145 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.076030 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 16:20:14.076145 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.076050 2569 factory.go:103] Registering Raw factory
Apr 17 16:20:14.076145 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.076064 2569 manager.go:1196] Started watching for new ooms in manager
Apr 17 16:20:14.076587 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.076444 2569 manager.go:319] Starting recovery of all containers
Apr 17 16:20:14.077469 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.077442 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:20:14.080513 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.080492 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-214.ec2.internal\" not found" node="ip-10-0-136-214.ec2.internal"
Apr 17 16:20:14.087059 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.087042 2569 manager.go:324] Recovery completed
Apr 17 16:20:14.088321 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.088303 2569 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 17 16:20:14.091088 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.091076 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:20:14.094130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.094113 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:20:14.094201 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.094144 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:20:14.094201 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.094158 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:20:14.094708 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.094696 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 16:20:14.094708 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.094708 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 16:20:14.094801 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.094724 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:20:14.097066 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.097054 2569 policy_none.go:49] "None policy: Start"
Apr 17 16:20:14.097114 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.097070 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 16:20:14.097114 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.097080 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 16:20:14.141928 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.141906 2569 manager.go:341] "Starting Device Plugin manager"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.141975 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.141986 2569 server.go:85] "Starting device plugin registration server"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.142216 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.142243 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.142358 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.142439 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.142445 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
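[Editor's note] The device-plugin registration server announced above (version "v1beta1", socket /var/lib/kubelet/device-plugins/kubelet.sock) is the target of the handshake every device plugin performs. A sketch of that handshake, assuming k8s.io/kubelet/pkg/apis/deviceplugin/v1beta1; the endpoint and resource name are hypothetical, and a real plugin must already be serving the DevicePlugin gRPC service on its own socket before registering:

    package main

    import (
            "context"
            "log"
            "time"

            "google.golang.org/grpc"
            "google.golang.org/grpc/credentials/insecure"
            pluginapi "k8s.io/kubelet/pkg/apis/deviceplugin/v1beta1"
    )

    func main() {
            // pluginapi.KubeletSocket is /var/lib/kubelet/device-plugins/kubelet.sock,
            // the same path logged by manager.go:141 and server.go:72 above.
            conn, err := grpc.Dial("unix://"+pluginapi.KubeletSocket,
                    grpc.WithTransportCredentials(insecure.NewCredentials()))
            if err != nil {
                    log.Fatal(err)
            }
            defer conn.Close()

            ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
            defer cancel()

            // Announce which API version we speak, where our own socket lives
            // (relative to the device-plugins directory), and which extended
            // resource we advertise.
            _, err = pluginapi.NewRegistrationClient(conn).Register(ctx, &pluginapi.RegisterRequest{
                    Version:      pluginapi.Version,
                    Endpoint:     "example-device.sock",        // hypothetical plugin socket
                    ResourceName: "example.com/example-device", // hypothetical resource
            })
            if err != nil {
                    log.Fatal(err)
            }
    }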
err="non-existent label \"crio-containers\"" Apr 17 16:20:14.143925 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.143408 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.210251 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.210148 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:20:14.211440 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.211423 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:20:14.211502 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.211453 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:20:14.211502 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.211475 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:20:14.211502 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.211482 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:20:14.211645 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.211574 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:20:14.213378 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.213354 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:20:14.243359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.243330 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:20:14.244470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.244453 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:20:14.244574 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.244481 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:20:14.244574 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.244492 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:20:14.244574 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.244516 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.253385 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.253360 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.253385 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.253385 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-214.ec2.internal\": node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.269182 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.269151 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.312322 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.312285 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal"] Apr 17 16:20:14.312420 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.312371 2569 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:20:14.314049 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.314036 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:20:14.314126 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.314063 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:20:14.314126 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.314074 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:20:14.315280 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.315262 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:20:14.315439 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.315423 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.315509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.315458 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:20:14.316019 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.315995 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:20:14.316113 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.316028 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:20:14.316113 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.316053 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:20:14.316113 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.316068 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:20:14.316113 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.316032 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:20:14.316323 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.316117 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:20:14.317165 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.317151 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.317212 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.317184 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:20:14.318567 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.318166 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:20:14.318567 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.318190 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:20:14.318567 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.318203 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:20:14.332814 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.332793 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-214.ec2.internal\" not found" node="ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.337032 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.337015 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-214.ec2.internal\" not found" node="ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.370009 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.369978 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.377493 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.377465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fdc147543c8fe25f8dcbfbdfb16786d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal\" (UID: \"4fdc147543c8fe25f8dcbfbdfb16786d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.377620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.377498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06511d0037f371d29f77ad6b941c9dbf-config\") pod \"kube-apiserver-proxy-ip-10-0-136-214.ec2.internal\" (UID: \"06511d0037f371d29f77ad6b941c9dbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.377620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.377524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4fdc147543c8fe25f8dcbfbdfb16786d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal\" (UID: \"4fdc147543c8fe25f8dcbfbdfb16786d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.470595 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.470498 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.477889 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.477860 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/4fdc147543c8fe25f8dcbfbdfb16786d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal\" (UID: \"4fdc147543c8fe25f8dcbfbdfb16786d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.478016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.477896 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fdc147543c8fe25f8dcbfbdfb16786d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal\" (UID: \"4fdc147543c8fe25f8dcbfbdfb16786d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.478016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.477923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06511d0037f371d29f77ad6b941c9dbf-config\") pod \"kube-apiserver-proxy-ip-10-0-136-214.ec2.internal\" (UID: \"06511d0037f371d29f77ad6b941c9dbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.478016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.477956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4fdc147543c8fe25f8dcbfbdfb16786d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal\" (UID: \"4fdc147543c8fe25f8dcbfbdfb16786d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.478016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.477964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06511d0037f371d29f77ad6b941c9dbf-config\") pod \"kube-apiserver-proxy-ip-10-0-136-214.ec2.internal\" (UID: \"06511d0037f371d29f77ad6b941c9dbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.478016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.477961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fdc147543c8fe25f8dcbfbdfb16786d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal\" (UID: \"4fdc147543c8fe25f8dcbfbdfb16786d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.571319 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.571277 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.634746 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.634720 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.639377 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.639359 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" Apr 17 16:20:14.671996 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.671962 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.772555 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.772472 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.873040 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.873006 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.973519 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:14.973483 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:14.977818 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.977800 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:20:14.977978 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.977960 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:20:14.978031 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:14.977983 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:20:15.063651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.063574 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:15:14 +0000 UTC" deadline="2027-09-20 18:12:42.001787781 +0000 UTC" Apr 17 16:20:15.063651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.063606 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12505h52m26.938183652s" Apr 17 16:20:15.074029 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:15.073995 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:15.075093 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.075079 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:20:15.086519 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.086492 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:20:15.104723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.104697 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t4xbw" Apr 17 16:20:15.112779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.112751 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t4xbw" Apr 17 16:20:15.174317 ip-10-0-136-214 kubenswrapper[2569]: E0417 
16:20:15.174289 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:15.183303 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:15.183274 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fdc147543c8fe25f8dcbfbdfb16786d.slice/crio-8fc229ae66ceac653e7086b3c6bc55e9632bf00318a425e098a45a6c94e7a8a9 WatchSource:0}: Error finding container 8fc229ae66ceac653e7086b3c6bc55e9632bf00318a425e098a45a6c94e7a8a9: Status 404 returned error can't find the container with id 8fc229ae66ceac653e7086b3c6bc55e9632bf00318a425e098a45a6c94e7a8a9 Apr 17 16:20:15.183599 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:15.183581 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06511d0037f371d29f77ad6b941c9dbf.slice/crio-443c021fd5387e92e529476b64b9fa3e3a70bdf13f32050b91dfbe438413ba03 WatchSource:0}: Error finding container 443c021fd5387e92e529476b64b9fa3e3a70bdf13f32050b91dfbe438413ba03: Status 404 returned error can't find the container with id 443c021fd5387e92e529476b64b9fa3e3a70bdf13f32050b91dfbe438413ba03 Apr 17 16:20:15.188191 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.188176 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:20:15.214441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.214392 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" event={"ID":"4fdc147543c8fe25f8dcbfbdfb16786d","Type":"ContainerStarted","Data":"8fc229ae66ceac653e7086b3c6bc55e9632bf00318a425e098a45a6c94e7a8a9"} Apr 17 16:20:15.215197 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.215177 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" event={"ID":"06511d0037f371d29f77ad6b941c9dbf","Type":"ContainerStarted","Data":"443c021fd5387e92e529476b64b9fa3e3a70bdf13f32050b91dfbe438413ba03"} Apr 17 16:20:15.275397 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:15.275355 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:15.321294 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.321270 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:20:15.376499 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:15.376461 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:15.476889 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:15.476840 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-214.ec2.internal\" not found" Apr 17 16:20:15.505579 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.505556 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:20:15.575644 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.575556 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" Apr 17 16:20:15.588817 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.588784 2569 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:20:15.589942 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.589926 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" Apr 17 16:20:15.595544 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:15.595524 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:20:16.050362 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.050275 2569 apiserver.go:52] "Watching apiserver" Apr 17 16:20:16.057690 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.057656 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 16:20:16.059659 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.059619 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8vckd","openshift-multus/multus-additional-cni-plugins-vfhjq","openshift-network-diagnostics/network-check-target-hs4nv","kube-system/konnectivity-agent-pr4pk","openshift-image-registry/node-ca-fptfr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal","openshift-multus/multus-jswsr","openshift-multus/network-metrics-daemon-tfgvs","openshift-network-operator/iptables-alerter-4qbdr","openshift-ovn-kubernetes/ovnkube-node-fpc4t","kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg","openshift-cluster-node-tuning-operator/tuned-2cqbs"] Apr 17 16:20:16.061757 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.061633 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.063707 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.063675 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.064526 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.064272 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:20:16.064526 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.064300 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:20:16.064526 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.064329 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.064526 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.064280 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:20:16.065272 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.064896 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vh9w7\"" Apr 17 16:20:16.065786 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.065760 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:16.065894 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.065835 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:16.066389 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.066368 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-np6qb\"" Apr 17 16:20:16.066492 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.066391 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:20:16.066492 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.066391 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:20:16.069327 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.069308 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.069426 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.069389 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.071846 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.071293 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.071846 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.071728 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 16:20:16.071846 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.071753 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wdh86\"" Apr 17 16:20:16.072925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.071755 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:20:16.072925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.071909 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:20:16.072925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.071967 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jx9nj\"" Apr 17 16:20:16.072925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.072088 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.072925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.072133 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 16:20:16.073613 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.073542 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 
16:20:16.073818 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.073800 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.074109 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.074059 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vl8tq\"" Apr 17 16:20:16.075482 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.075464 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.075575 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.075542 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:16.075672 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.075657 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.078045 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.077909 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.078130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.078062 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.078443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.078332 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:20:16.078443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.078386 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:20:16.078618 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.078596 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dwqz4\"" Apr 17 16:20:16.080515 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.080067 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.080686 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.080663 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:20:16.080763 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.080701 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:20:16.081444 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.081425 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:20:16.081547 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.081529 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tzvxd\"" Apr 17 16:20:16.082383 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.082366 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.082480 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.082408 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.082599 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.082570 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:20:16.082743 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.082713 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:20:16.084518 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.084494 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h7pmc\"" Apr 17 16:20:16.084776 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.084754 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:20:16.085825 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.085425 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q95ft\"" Apr 17 16:20:16.085825 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.085522 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.085825 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.085559 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:20:16.086439 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.086282 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:20:16.087814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.087242 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:20:16.088189 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088113 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-node-log\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.088189 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088154 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-etc-kubernetes\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.088351 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088242 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-kubelet\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.088351 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-log-socket\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.088453 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm4p4\" (UniqueName: \"kubernetes.io/projected/fb03560e-c45d-4041-b046-c5c9b2fd22a8-kube-api-access-cm4p4\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.088453 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088422 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-cnibin\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.088551 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.088551 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088492 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d4f3209-9daa-4cca-9236-5918fad01d8d-agent-certs\") pod \"konnectivity-agent-pr4pk\" (UID: \"3d4f3209-9daa-4cca-9236-5918fad01d8d\") " pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.088551 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-multus-certs\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.088693 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-system-cni-dir\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.088693 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088591 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-cni-multus\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.088693 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xb57\" (UniqueName: \"kubernetes.io/projected/9a537c40-6a2e-4250-8d81-dfa908f4f536-kube-api-access-4xb57\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.088693 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3237eb23-86ef-44a2-98cb-f37d4d9fb915-host\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.088693 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.088976 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088716 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d4f3209-9daa-4cca-9236-5918fad01d8d-konnectivity-ca\") pod \"konnectivity-agent-pr4pk\" (UID: \"3d4f3209-9daa-4cca-9236-5918fad01d8d\") " pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.088976 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-cni-bin\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.088976 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088884 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-daemon-config\") pod \"multus-jswsr\" (UID: 
\"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.088976 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-ovn\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.088976 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.088955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-cnibin\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-k8s-cni-cncf-io\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089085 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/919c2101-3bb9-439c-89fe-f84487ea8e6d-tmp-dir\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089112 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrq4\" (UniqueName: \"kubernetes.io/projected/919c2101-3bb9-439c-89fe-f84487ea8e6d-kube-api-access-rdrq4\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089156 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-cni-bin\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-os-release\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089261 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-cni-netd\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089285 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-os-release\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.089332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsxzn\" (UniqueName: \"kubernetes.io/projected/3237eb23-86ef-44a2-98cb-f37d4d9fb915-kube-api-access-tsxzn\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ac3b0199-072a-4b90-a39a-d710ba4581ae-iptables-alerter-script\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-hostroot\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-var-lib-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089434 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089469 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-env-overrides\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-netns\") 
pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-kubelet\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/919c2101-3bb9-439c-89fe-f84487ea8e6d-hosts-file\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089697 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-systemd-units\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.089748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-system-cni-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089849 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-conf-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089887 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-slash\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089922 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-run-netns\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.089973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090007 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-socket-dir-parent\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090040 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862sw\" (UniqueName: \"kubernetes.io/projected/b74a4398-a3fb-40e5-b014-d968d4c10069-kube-api-access-862sw\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac3b0199-072a-4b90-a39a-d710ba4581ae-host-slash\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vszh\" (UniqueName: \"kubernetes.io/projected/3dfa7029-ad7c-4849-aaf2-9516b86babac-kube-api-access-6vszh\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090175 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-cni-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.090262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmml6\" (UniqueName: \"kubernetes.io/projected/ac3b0199-072a-4b90-a39a-d710ba4581ae-kube-api-access-rmml6\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090275 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovnkube-script-lib\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090313 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090355 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090390 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3237eb23-86ef-44a2-98cb-f37d4d9fb915-serviceca\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090428 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-systemd\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovnkube-config\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090485 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovn-node-metrics-cert\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090518 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-cni-binary-copy\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090569 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a537c40-6a2e-4250-8d81-dfa908f4f536-cni-binary-copy\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.090735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.090644 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-etc-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.114048 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.113883 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:15:15 +0000 UTC" deadline="2027-12-29 22:24:38.808860396 +0000 UTC" Apr 17 16:20:16.114048 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.113910 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14910h4m22.694952891s" Apr 17 16:20:16.176853 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.176824 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:20:16.191628 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191602 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-var-lib-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.191779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.191779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-env-overrides\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.191779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-netns\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.191779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-kubelet\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.191779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-var-lib-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.191779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/919c2101-3bb9-439c-89fe-f84487ea8e6d-hosts-file\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191796 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191798 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-socket-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191817 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/919c2101-3bb9-439c-89fe-f84487ea8e6d-hosts-file\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-systemd-units\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-netns\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191855 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-system-cni-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191860 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-kubelet\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191889 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-systemd-units\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191880 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-conf-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-system-cni-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-host\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191957 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-slash\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-run-netns\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.191930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-conf-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-slash\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192078 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-run-netns\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192091 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-socket-dir-parent\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-862sw\" (UniqueName: \"kubernetes.io/projected/b74a4398-a3fb-40e5-b014-d968d4c10069-kube-api-access-862sw\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac3b0199-072a-4b90-a39a-d710ba4581ae-host-slash\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vszh\" (UniqueName: \"kubernetes.io/projected/3dfa7029-ad7c-4849-aaf2-9516b86babac-kube-api-access-6vszh\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192191 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-socket-dir-parent\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac3b0199-072a-4b90-a39a-d710ba4581ae-host-slash\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192278 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-env-overrides\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-cni-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmml6\" (UniqueName: \"kubernetes.io/projected/ac3b0199-072a-4b90-a39a-d710ba4581ae-kube-api-access-rmml6\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.192812 ip-10-0-136-214 
kubenswrapper[2569]: I0417 16:20:16.192409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovnkube-script-lib\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192460 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-cni-dir\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3237eb23-86ef-44a2-98cb-f37d4d9fb915-serviceca\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192556 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-systemd\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovnkube-config\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.192812 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192586 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovn-node-metrics-cert\") pod 
\"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-cni-binary-copy\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a537c40-6a2e-4250-8d81-dfa908f4f536-cni-binary-copy\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysconfig\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysctl-d\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192916 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovnkube-script-lib\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.192968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-systemd\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193063 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovnkube-config\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193112 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3237eb23-86ef-44a2-98cb-f37d4d9fb915-serviceca\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193162 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193185 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-etc-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.193337 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.193414 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:16.69338306 +0000 UTC m=+3.074011402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-cni-binary-copy\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-node-log\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.193552 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-etc-kubernetes\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193600 
2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-sys-fs\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193607 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-etc-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193607 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-node-log\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-etc-kubernetes\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193674 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-systemd\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-var-lib-kubelet\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193726 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-kubelet\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-log-socket\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm4p4\" (UniqueName: \"kubernetes.io/projected/fb03560e-c45d-4041-b046-c5c9b2fd22a8-kube-api-access-cm4p4\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193778 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-kubelet\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-cnibin\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-log-socket\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193852 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d4f3209-9daa-4cca-9236-5918fad01d8d-agent-certs\") pod \"konnectivity-agent-pr4pk\" (UID: \"3d4f3209-9daa-4cca-9236-5918fad01d8d\") " pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-multus-certs\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193940 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-device-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193962 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-sys\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-cnibin\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.193987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-system-cni-dir\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-cni-multus\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-system-cni-dir\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-multus-certs\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194067 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xb57\" (UniqueName: \"kubernetes.io/projected/9a537c40-6a2e-4250-8d81-dfa908f4f536-kube-api-access-4xb57\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3237eb23-86ef-44a2-98cb-f37d4d9fb915-host\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d4f3209-9daa-4cca-9236-5918fad01d8d-konnectivity-ca\") pod \"konnectivity-agent-pr4pk\" (UID: \"3d4f3209-9daa-4cca-9236-5918fad01d8d\") " pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:20:16.194176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3237eb23-86ef-44a2-98cb-f37d4d9fb915-host\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-cni-bin\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ml6\" (UniqueName: \"kubernetes.io/projected/c386e0eb-4858-47af-8e4e-d86c1667d4a6-kube-api-access-n4ml6\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g7k2\" (UniqueName: \"kubernetes.io/projected/109d48c7-fa97-439a-a4f8-f4753386ffa6-kube-api-access-6g7k2\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194316 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-cni-multus\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.194860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3dfa7029-ad7c-4849-aaf2-9516b86babac-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-daemon-config\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194392 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194421 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-tuned\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-var-lib-cni-bin\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/109d48c7-fa97-439a-a4f8-f4753386ffa6-tmp\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194474 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-ovn\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-cnibin\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194527 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-k8s-cni-cncf-io\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194536 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-ovn\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/919c2101-3bb9-439c-89fe-f84487ea8e6d-tmp-dir\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194574 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-cnibin\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrq4\" (UniqueName: \"kubernetes.io/projected/919c2101-3bb9-439c-89fe-f84487ea8e6d-kube-api-access-rdrq4\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-etc-selinux\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194645 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-host-run-k8s-cni-cncf-io\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194670 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-cni-bin\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-os-release\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.195660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194747 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-modprobe-d\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-kubernetes\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-run\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:20:16.194814 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a537c40-6a2e-4250-8d81-dfa908f4f536-cni-binary-copy\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-cni-netd\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/919c2101-3bb9-439c-89fe-f84487ea8e6d-tmp-dir\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-os-release\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3dfa7029-ad7c-4849-aaf2-9516b86babac-os-release\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsxzn\" (UniqueName: \"kubernetes.io/projected/3237eb23-86ef-44a2-98cb-f37d4d9fb915-kube-api-access-tsxzn\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194924 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysctl-conf\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194931 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-run-openvswitch\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.196359 ip-10-0-136-214 
kubenswrapper[2569]: I0417 16:20:16.194952 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ac3b0199-072a-4b90-a39a-d710ba4581ae-iptables-alerter-script\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.194978 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-hostroot\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195004 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-registration-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-lib-modules\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195085 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-cni-netd\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195121 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-hostroot\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.196359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a537c40-6a2e-4250-8d81-dfa908f4f536-os-release\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.197073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195241 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb03560e-c45d-4041-b046-c5c9b2fd22a8-host-cni-bin\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.197073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ac3b0199-072a-4b90-a39a-d710ba4581ae-iptables-alerter-script\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " 
pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.197073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a537c40-6a2e-4250-8d81-dfa908f4f536-multus-daemon-config\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.197073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.195950 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d4f3209-9daa-4cca-9236-5918fad01d8d-konnectivity-ca\") pod \"konnectivity-agent-pr4pk\" (UID: \"3d4f3209-9daa-4cca-9236-5918fad01d8d\") " pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.197073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.196904 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb03560e-c45d-4041-b046-c5c9b2fd22a8-ovn-node-metrics-cert\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.197073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.197052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d4f3209-9daa-4cca-9236-5918fad01d8d-agent-certs\") pod \"konnectivity-agent-pr4pk\" (UID: \"3d4f3209-9daa-4cca-9236-5918fad01d8d\") " pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.200334 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.200312 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:16.200334 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.200336 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:16.200473 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.200346 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:16.200473 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.200394 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:16.70038023 +0000 UTC m=+3.081008570 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:16.202666 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.202593 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmml6\" (UniqueName: \"kubernetes.io/projected/ac3b0199-072a-4b90-a39a-d710ba4581ae-kube-api-access-rmml6\") pod \"iptables-alerter-4qbdr\" (UID: \"ac3b0199-072a-4b90-a39a-d710ba4581ae\") " pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.204318 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.204284 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vszh\" (UniqueName: \"kubernetes.io/projected/3dfa7029-ad7c-4849-aaf2-9516b86babac-kube-api-access-6vszh\") pod \"multus-additional-cni-plugins-vfhjq\" (UID: \"3dfa7029-ad7c-4849-aaf2-9516b86babac\") " pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.205157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.205088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-862sw\" (UniqueName: \"kubernetes.io/projected/b74a4398-a3fb-40e5-b014-d968d4c10069-kube-api-access-862sw\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.205157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.205138 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsxzn\" (UniqueName: \"kubernetes.io/projected/3237eb23-86ef-44a2-98cb-f37d4d9fb915-kube-api-access-tsxzn\") pod \"node-ca-fptfr\" (UID: \"3237eb23-86ef-44a2-98cb-f37d4d9fb915\") " pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.205555 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.205184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrq4\" (UniqueName: \"kubernetes.io/projected/919c2101-3bb9-439c-89fe-f84487ea8e6d-kube-api-access-rdrq4\") pod \"node-resolver-8vckd\" (UID: \"919c2101-3bb9-439c-89fe-f84487ea8e6d\") " pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.206030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.206010 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xb57\" (UniqueName: \"kubernetes.io/projected/9a537c40-6a2e-4250-8d81-dfa908f4f536-kube-api-access-4xb57\") pod \"multus-jswsr\" (UID: \"9a537c40-6a2e-4250-8d81-dfa908f4f536\") " pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.208468 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.208449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm4p4\" (UniqueName: \"kubernetes.io/projected/fb03560e-c45d-4041-b046-c5c9b2fd22a8-kube-api-access-cm4p4\") pod \"ovnkube-node-fpc4t\" (UID: \"fb03560e-c45d-4041-b046-c5c9b2fd22a8\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.296341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296246 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-sys-fs\") pod 
\"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-systemd\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-sys-fs\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-var-lib-kubelet\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-device-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-var-lib-kubelet\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296376 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-systemd\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296418 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-sys\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ml6\" (UniqueName: \"kubernetes.io/projected/c386e0eb-4858-47af-8e4e-d86c1667d4a6-kube-api-access-n4ml6\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296450 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-device-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296479 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g7k2\" (UniqueName: \"kubernetes.io/projected/109d48c7-fa97-439a-a4f8-f4753386ffa6-kube-api-access-6g7k2\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296487 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-sys\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-tuned\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296550 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/109d48c7-fa97-439a-a4f8-f4753386ffa6-tmp\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-etc-selinux\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-modprobe-d\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296624 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-kubernetes\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.296657 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:20:16.296645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-run\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296672 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysctl-conf\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296693 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-registration-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-lib-modules\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-socket-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296741 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-host\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-run\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysconfig\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 
16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysctl-d\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-modprobe-d\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296897 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysconfig\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-socket-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296942 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-kubernetes\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-host\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysctl-d\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.296986 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-registration-dir\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.297384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.297004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c386e0eb-4858-47af-8e4e-d86c1667d4a6-etc-selinux\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.298198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.297059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-sysctl-conf\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.298198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.297091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/109d48c7-fa97-439a-a4f8-f4753386ffa6-lib-modules\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.299047 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.299025 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/109d48c7-fa97-439a-a4f8-f4753386ffa6-etc-tuned\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.299153 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.299053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/109d48c7-fa97-439a-a4f8-f4753386ffa6-tmp\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.304539 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.304457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g7k2\" (UniqueName: \"kubernetes.io/projected/109d48c7-fa97-439a-a4f8-f4753386ffa6-kube-api-access-6g7k2\") pod \"tuned-2cqbs\" (UID: \"109d48c7-fa97-439a-a4f8-f4753386ffa6\") " pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.304539 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.304473 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ml6\" (UniqueName: \"kubernetes.io/projected/c386e0eb-4858-47af-8e4e-d86c1667d4a6-kube-api-access-n4ml6\") pod \"aws-ebs-csi-driver-node-22ngg\" (UID: \"c386e0eb-4858-47af-8e4e-d86c1667d4a6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.331766 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.331745 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:20:16.375653 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.375620 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jswsr" Apr 17 16:20:16.382958 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.382928 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a537c40_6a2e_4250_8d81_dfa908f4f536.slice/crio-8677ade6ef2fc2cde840813fabea0f2c8ac149cb922290b0ad186dd6de112c59 WatchSource:0}: Error finding container 8677ade6ef2fc2cde840813fabea0f2c8ac149cb922290b0ad186dd6de112c59: Status 404 returned error can't find the container with id 8677ade6ef2fc2cde840813fabea0f2c8ac149cb922290b0ad186dd6de112c59 Apr 17 16:20:16.394336 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.394314 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" Apr 17 16:20:16.401304 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.401273 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfa7029_ad7c_4849_aaf2_9516b86babac.slice/crio-b86c051d493883956aa9497d6b24d8dd83eb58483db88e88dfe8d1c42d61a8ea WatchSource:0}: Error finding container b86c051d493883956aa9497d6b24d8dd83eb58483db88e88dfe8d1c42d61a8ea: Status 404 returned error can't find the container with id b86c051d493883956aa9497d6b24d8dd83eb58483db88e88dfe8d1c42d61a8ea Apr 17 16:20:16.403347 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.403328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:16.410002 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.409974 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fptfr" Apr 17 16:20:16.410303 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.410267 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d4f3209_9daa_4cca_9236_5918fad01d8d.slice/crio-78619686e309a9f572f339aa7737275351826b250568d31feca015509df8b181 WatchSource:0}: Error finding container 78619686e309a9f572f339aa7737275351826b250568d31feca015509df8b181: Status 404 returned error can't find the container with id 78619686e309a9f572f339aa7737275351826b250568d31feca015509df8b181 Apr 17 16:20:16.416522 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.416500 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8vckd" Apr 17 16:20:16.418662 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.418633 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3237eb23_86ef_44a2_98cb_f37d4d9fb915.slice/crio-80eb0dc501a34730a31124a2470a60845803c60c75750811091dd7253a5c4229 WatchSource:0}: Error finding container 80eb0dc501a34730a31124a2470a60845803c60c75750811091dd7253a5c4229: Status 404 returned error can't find the container with id 80eb0dc501a34730a31124a2470a60845803c60c75750811091dd7253a5c4229 Apr 17 16:20:16.424458 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.424419 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919c2101_3bb9_439c_89fe_f84487ea8e6d.slice/crio-b15201f96d4993f72e1a834106686d24377c06d3e786ae084d3b5b90f4a80f86 WatchSource:0}: Error finding container b15201f96d4993f72e1a834106686d24377c06d3e786ae084d3b5b90f4a80f86: Status 404 returned error can't find the container with id b15201f96d4993f72e1a834106686d24377c06d3e786ae084d3b5b90f4a80f86 Apr 17 16:20:16.425855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.425645 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4qbdr" Apr 17 16:20:16.433462 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.433441 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:16.433630 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.433605 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3b0199_072a_4b90_a39a_d710ba4581ae.slice/crio-aee602dc3b7c133d25101f5f6744aea35e8e89516ad261817cefa13b4a6cc865 WatchSource:0}: Error finding container aee602dc3b7c133d25101f5f6744aea35e8e89516ad261817cefa13b4a6cc865: Status 404 returned error can't find the container with id aee602dc3b7c133d25101f5f6744aea35e8e89516ad261817cefa13b4a6cc865 Apr 17 16:20:16.441563 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.441541 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" Apr 17 16:20:16.441964 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.441947 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb03560e_c45d_4041_b046_c5c9b2fd22a8.slice/crio-9cad17e6c48b13fe617e0cfe13dab1898ff7af480cb9ece964be3fb2100c6f6f WatchSource:0}: Error finding container 9cad17e6c48b13fe617e0cfe13dab1898ff7af480cb9ece964be3fb2100c6f6f: Status 404 returned error can't find the container with id 9cad17e6c48b13fe617e0cfe13dab1898ff7af480cb9ece964be3fb2100c6f6f Apr 17 16:20:16.447157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.447136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" Apr 17 16:20:16.447273 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.447196 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc386e0eb_4858_47af_8e4e_d86c1667d4a6.slice/crio-8e9e533c2b61cdd166949c7b2183b779aead1774a75218556e3bd1fee6e61f67 WatchSource:0}: Error finding container 8e9e533c2b61cdd166949c7b2183b779aead1774a75218556e3bd1fee6e61f67: Status 404 returned error can't find the container with id 8e9e533c2b61cdd166949c7b2183b779aead1774a75218556e3bd1fee6e61f67 Apr 17 16:20:16.453760 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:20:16.453733 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod109d48c7_fa97_439a_a4f8_f4753386ffa6.slice/crio-048785596efe2bb051a965056d7b3add55c3b1c64ba40b02c9f07b8022c743fe WatchSource:0}: Error finding container 048785596efe2bb051a965056d7b3add55c3b1c64ba40b02c9f07b8022c743fe: Status 404 returned error can't find the container with id 048785596efe2bb051a965056d7b3add55c3b1c64ba40b02c9f07b8022c743fe Apr 17 16:20:16.505094 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.505067 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:20:16.699976 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.699935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:16.700148 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.700087 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:16.700205 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.700164 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:17.700143832 +0000 UTC m=+4.080772186 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:16.800526 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:16.800495 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:16.800657 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.800621 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:16.800657 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.800636 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:16.800657 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.800645 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:16.800781 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:16.800690 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:17.800678076 +0000 UTC m=+4.181306415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:17.114562 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.114523 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:15:15 +0000 UTC" deadline="2027-09-14 12:26:41.551826732 +0000 UTC" Apr 17 16:20:17.114562 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.114562 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12356h6m24.437268779s" Apr 17 16:20:17.212617 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.212436 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:17.212617 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.212552 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:17.212617 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.212587 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:17.214159 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.214100 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:17.221919 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.221283 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" event={"ID":"06511d0037f371d29f77ad6b941c9dbf","Type":"ContainerStarted","Data":"f9a1ef218975a0f315ee6b4cb44a6343d80425b5ea89133bca12f839ede78778"} Apr 17 16:20:17.225351 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.225286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" event={"ID":"109d48c7-fa97-439a-a4f8-f4753386ffa6","Type":"ContainerStarted","Data":"048785596efe2bb051a965056d7b3add55c3b1c64ba40b02c9f07b8022c743fe"} Apr 17 16:20:17.227421 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.227369 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"9cad17e6c48b13fe617e0cfe13dab1898ff7af480cb9ece964be3fb2100c6f6f"} Apr 17 16:20:17.231447 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.231328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4qbdr" event={"ID":"ac3b0199-072a-4b90-a39a-d710ba4581ae","Type":"ContainerStarted","Data":"aee602dc3b7c133d25101f5f6744aea35e8e89516ad261817cefa13b4a6cc865"} Apr 17 16:20:17.238945 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.238867 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fptfr" event={"ID":"3237eb23-86ef-44a2-98cb-f37d4d9fb915","Type":"ContainerStarted","Data":"80eb0dc501a34730a31124a2470a60845803c60c75750811091dd7253a5c4229"} Apr 17 16:20:17.243017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.242989 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jswsr" event={"ID":"9a537c40-6a2e-4250-8d81-dfa908f4f536","Type":"ContainerStarted","Data":"8677ade6ef2fc2cde840813fabea0f2c8ac149cb922290b0ad186dd6de112c59"} Apr 17 16:20:17.247739 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.247495 2569 generic.go:358] "Generic (PLEG): container finished" podID="4fdc147543c8fe25f8dcbfbdfb16786d" 
containerID="77753db9431033962161cddb0be44722b0d1f5aabf5e6cd1db4702f576517ff2" exitCode=0 Apr 17 16:20:17.247739 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.247670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" event={"ID":"4fdc147543c8fe25f8dcbfbdfb16786d","Type":"ContainerDied","Data":"77753db9431033962161cddb0be44722b0d1f5aabf5e6cd1db4702f576517ff2"} Apr 17 16:20:17.254991 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.254613 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" event={"ID":"c386e0eb-4858-47af-8e4e-d86c1667d4a6","Type":"ContainerStarted","Data":"8e9e533c2b61cdd166949c7b2183b779aead1774a75218556e3bd1fee6e61f67"} Apr 17 16:20:17.262428 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.261922 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-214.ec2.internal" podStartSLOduration=2.261902709 podStartE2EDuration="2.261902709s" podCreationTimestamp="2026-04-17 16:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:20:17.235011255 +0000 UTC m=+3.615639619" watchObservedRunningTime="2026-04-17 16:20:17.261902709 +0000 UTC m=+3.642531071" Apr 17 16:20:17.262428 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.262383 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8vckd" event={"ID":"919c2101-3bb9-439c-89fe-f84487ea8e6d","Type":"ContainerStarted","Data":"b15201f96d4993f72e1a834106686d24377c06d3e786ae084d3b5b90f4a80f86"} Apr 17 16:20:17.263608 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.263530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pr4pk" event={"ID":"3d4f3209-9daa-4cca-9236-5918fad01d8d","Type":"ContainerStarted","Data":"78619686e309a9f572f339aa7737275351826b250568d31feca015509df8b181"} Apr 17 16:20:17.264757 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.264713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerStarted","Data":"b86c051d493883956aa9497d6b24d8dd83eb58483db88e88dfe8d1c42d61a8ea"} Apr 17 16:20:17.707607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.707568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:17.707785 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.707743 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:17.707844 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.707808 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:19.707788308 +0000 UTC m=+6.088416661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:17.808287 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:17.808190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:17.808462 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.808416 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:17.808462 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.808437 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:17.808462 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.808450 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:17.808619 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:17.808522 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:19.808503677 +0000 UTC m=+6.189132018 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:18.293988 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:18.293922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" event={"ID":"4fdc147543c8fe25f8dcbfbdfb16786d","Type":"ContainerStarted","Data":"6159029c60648b8f41446a5aa1b93696dd6631fc2569e812212e19136fa56b70"} Apr 17 16:20:19.211867 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:19.211832 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:19.212057 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.211972 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:19.212401 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:19.212378 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:19.212512 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.212475 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:19.721745 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:19.721707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:19.722192 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.721871 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:19.722192 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.721930 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:23.721910149 +0000 UTC m=+10.102538509 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:19.822517 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:19.822483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:19.822701 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.822675 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:19.822780 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.822701 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:19.822780 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.822715 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:19.822886 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:19.822781 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:23.822762248 +0000 UTC m=+10.203390591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:21.212432 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:21.212401 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:21.212901 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:21.212406 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:21.212901 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:21.212536 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:21.212901 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:21.212643 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:23.212790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:23.212555 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:23.212790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:23.212613 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:23.212790 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.212684 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:23.212790 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.212744 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:23.752688 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:23.752652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:23.752875 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.752801 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:23.752875 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.752867 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:31.752847923 +0000 UTC m=+18.133476277 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:23.853627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:23.853550 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:23.853818 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.853749 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:23.853818 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.853768 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:23.853818 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.853780 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:23.853983 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:23.853840 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:31.853820055 +0000 UTC m=+18.234448398 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:25.211705 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:25.211675 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:25.212171 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:25.211797 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:25.212171 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:25.212095 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:25.212356 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:25.212206 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:26.314405 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.314153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" event={"ID":"109d48c7-fa97-439a-a4f8-f4753386ffa6","Type":"ContainerStarted","Data":"1aae436e5c8ea89f55a1bdfd1b5ae5c8fd3b74498868e87709d297f5243c9891"} Apr 17 16:20:26.317729 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.317666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fptfr" event={"ID":"3237eb23-86ef-44a2-98cb-f37d4d9fb915","Type":"ContainerStarted","Data":"797baf68ca129d25e637b9eb7cc245762de375dafb3d0b577656992326362bec"} Apr 17 16:20:26.319804 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.319769 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" event={"ID":"c386e0eb-4858-47af-8e4e-d86c1667d4a6","Type":"ContainerStarted","Data":"1dd173448cfc3af8a3db0794ecffca0546f63b523b9ef2b00eedc77124bf5f89"} Apr 17 16:20:26.322449 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.322413 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8vckd" event={"ID":"919c2101-3bb9-439c-89fe-f84487ea8e6d","Type":"ContainerStarted","Data":"1e8edccef6be13c60f8a77ed3ecd365232cf515bec45004210a4d8804a85c8da"} Apr 17 16:20:26.324589 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.324548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pr4pk" event={"ID":"3d4f3209-9daa-4cca-9236-5918fad01d8d","Type":"ContainerStarted","Data":"7a0952d35bbe79b9490ef847d17051eca544f2489ed170887badaed4d68ffa73"} Apr 17 16:20:26.326895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.326437 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfa7029-ad7c-4849-aaf2-9516b86babac" containerID="d16fb4ff414997e5bbfa28e91bde162b2c7881094f3c8e5580663e7984206a1a" exitCode=0 Apr 17 16:20:26.326895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.326476 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerDied","Data":"d16fb4ff414997e5bbfa28e91bde162b2c7881094f3c8e5580663e7984206a1a"} Apr 17 16:20:26.338898 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.338583 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-214.ec2.internal" podStartSLOduration=11.338559461 podStartE2EDuration="11.338559461s" podCreationTimestamp="2026-04-17 16:20:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:20:18.309375937 +0000 UTC m=+4.690004299" watchObservedRunningTime="2026-04-17 16:20:26.338559461 +0000 UTC m=+12.719187822" Apr 17 16:20:26.339167 ip-10-0-136-214 
kubenswrapper[2569]: I0417 16:20:26.339122 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2cqbs" podStartSLOduration=3.410905755 podStartE2EDuration="12.339113573s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.455270913 +0000 UTC m=+2.835899252" lastFinishedPulling="2026-04-17 16:20:25.383478719 +0000 UTC m=+11.764107070" observedRunningTime="2026-04-17 16:20:26.338444459 +0000 UTC m=+12.719072822" watchObservedRunningTime="2026-04-17 16:20:26.339113573 +0000 UTC m=+12.719741934" Apr 17 16:20:26.389586 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.389289 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8vckd" podStartSLOduration=3.432408675 podStartE2EDuration="12.389273295s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.426391031 +0000 UTC m=+2.807019373" lastFinishedPulling="2026-04-17 16:20:25.38325564 +0000 UTC m=+11.763883993" observedRunningTime="2026-04-17 16:20:26.388508915 +0000 UTC m=+12.769137278" watchObservedRunningTime="2026-04-17 16:20:26.389273295 +0000 UTC m=+12.769901655" Apr 17 16:20:26.407713 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.407472 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pr4pk" podStartSLOduration=3.439416781 podStartE2EDuration="12.407450262s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.414464536 +0000 UTC m=+2.795092875" lastFinishedPulling="2026-04-17 16:20:25.382498 +0000 UTC m=+11.763126356" observedRunningTime="2026-04-17 16:20:26.407001358 +0000 UTC m=+12.787629720" watchObservedRunningTime="2026-04-17 16:20:26.407450262 +0000 UTC m=+12.788078624" Apr 17 16:20:26.431284 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:26.431207 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fptfr" podStartSLOduration=3.469754373 podStartE2EDuration="12.431188278s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.421121776 +0000 UTC m=+2.801750118" lastFinishedPulling="2026-04-17 16:20:25.382555678 +0000 UTC m=+11.763184023" observedRunningTime="2026-04-17 16:20:26.430773063 +0000 UTC m=+12.811401449" watchObservedRunningTime="2026-04-17 16:20:26.431188278 +0000 UTC m=+12.811816641" Apr 17 16:20:27.212151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:27.212116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:27.212332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:27.212116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:27.212332 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:27.212256 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:27.212448 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:27.212386 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:27.330044 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:27.330007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4qbdr" event={"ID":"ac3b0199-072a-4b90-a39a-d710ba4581ae","Type":"ContainerStarted","Data":"47ffa8ef3e2501012d313b37c696909e5f9811f5eccecdaaf8d5ead0f0ac4ed6"} Apr 17 16:20:27.345341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:27.345294 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4qbdr" podStartSLOduration=4.391078744 podStartE2EDuration="13.345279539s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.4365322 +0000 UTC m=+2.817160539" lastFinishedPulling="2026-04-17 16:20:25.390732991 +0000 UTC m=+11.771361334" observedRunningTime="2026-04-17 16:20:27.344984894 +0000 UTC m=+13.725613256" watchObservedRunningTime="2026-04-17 16:20:27.345279539 +0000 UTC m=+13.725907900" Apr 17 16:20:29.212718 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:29.212679 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:29.213190 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:29.212690 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:29.213190 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:29.212795 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:29.213190 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:29.212868 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:29.455318 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:29.455275 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:29.456346 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:29.456325 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:30.335748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:30.335718 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:30.336339 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:30.336319 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pr4pk" Apr 17 16:20:31.212258 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:31.212207 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:31.212430 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:31.212206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:31.212430 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.212380 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:31.212430 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.212417 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:31.811382 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:31.811346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:31.811767 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.811470 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:31.811767 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.811525 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:47.811512102 +0000 UTC m=+34.192140441 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:31.912467 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:31.912422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:31.912654 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.912590 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:31.912654 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.912614 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:31.912654 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.912627 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:31.912827 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:31.912691 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:47.912672724 +0000 UTC m=+34.293301088 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:33.212614 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:33.212573 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:33.213077 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:33.212714 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:33.213077 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:33.212573 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:33.213077 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:33.212829 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:35.212042 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:35.212001 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:35.212441 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:35.212122 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:35.212441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:35.212177 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:35.212441 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:35.212285 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:36.981378 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:36.981150 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:20:37.156874 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.156772 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:20:36.981376082Z","UUID":"e8299d7a-77c8-48fe-b69a-05a633fcc889","Handler":null,"Name":"","Endpoint":""} Apr 17 16:20:37.159533 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.159511 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:20:37.159641 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.159542 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:20:37.211872 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.211849 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:37.211952 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.211892 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:37.212024 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:37.212001 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:37.212147 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:37.212125 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:37.350543 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.350457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" event={"ID":"c386e0eb-4858-47af-8e4e-d86c1667d4a6","Type":"ContainerStarted","Data":"a2ce7e30f8a1e9fb26d636456227bbcfef0b797b1d46e6e99a8154f9b59ce6c0"} Apr 17 16:20:37.352111 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.352075 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfa7029-ad7c-4849-aaf2-9516b86babac" containerID="eddad0314d3926d1155554ecf1e85549fe44b31da59145ca307b9a78940b72d3" exitCode=0 Apr 17 16:20:37.352266 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.352168 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerDied","Data":"eddad0314d3926d1155554ecf1e85549fe44b31da59145ca307b9a78940b72d3"} Apr 17 16:20:37.354947 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.354817 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"1abd17ab8efa21cdef5f097571f0aade509868d2cf63e507815136b7000f087f"} Apr 17 16:20:37.354947 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.354857 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"2ac4a370add8938735bacb2a8219d361cce1e7ecf0aad881d89dc197c03fad71"} Apr 17 16:20:37.354947 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.354868 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"dc1c7ee1c1e8b6efd21cd36eaee12aa6d140ea96c9708cdd45a8c78432092811"} Apr 17 16:20:37.354947 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.354879 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"d4081220033e3d0eff661a3ca643b85a8709b6e96b0206fe70ca0f047148e68f"} Apr 17 16:20:37.354947 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.354889 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"5fb1949f6efcdb232e6e20a65316cef692df0ac072f97e2c21122fb8ef32aecc"} Apr 17 16:20:37.354947 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.354898 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"ad7703c0d0f227a24434f17d14315c78ea47081ae561d63e48abde0d50da0b42"} Apr 17 16:20:37.356114 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.356095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jswsr" event={"ID":"9a537c40-6a2e-4250-8d81-dfa908f4f536","Type":"ContainerStarted","Data":"44d4f5cd85f7d5cc0ae0a611bb6e7560059bec9a874ff2145c319844e74429ab"} Apr 17 16:20:37.390155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:37.390103 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jswsr" podStartSLOduration=4.474290418 podStartE2EDuration="23.390088181s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.384754314 +0000 UTC m=+2.765382653" lastFinishedPulling="2026-04-17 16:20:35.300552063 +0000 UTC m=+21.681180416" observedRunningTime="2026-04-17 16:20:37.389952109 +0000 UTC m=+23.770580471" watchObservedRunningTime="2026-04-17 16:20:37.390088181 +0000 UTC m=+23.770716541" Apr 17 16:20:38.360049 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:38.360013 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" event={"ID":"c386e0eb-4858-47af-8e4e-d86c1667d4a6","Type":"ContainerStarted","Data":"1f3b3ca657f6fda2aff06c6e31165798148748ba41ef4bcbda479e77b9080a7a"} Apr 17 16:20:38.376696 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:38.376650 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-22ngg" podStartSLOduration=2.98586241 podStartE2EDuration="24.376635429s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.448878033 +0000 UTC m=+2.829506372" lastFinishedPulling="2026-04-17 16:20:37.839651053 +0000 UTC m=+24.220279391" observedRunningTime="2026-04-17 16:20:38.3766169 +0000 UTC m=+24.757245261" watchObservedRunningTime="2026-04-17 16:20:38.376635429 +0000 UTC m=+24.757263768" Apr 17 16:20:39.212422 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:39.212387 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:39.212596 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:39.212387 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:39.212596 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:39.212499 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:39.212596 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:39.212567 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:39.364278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:39.364250 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfa7029-ad7c-4849-aaf2-9516b86babac" containerID="a752310c104c5bd67fedfc00a6900ebae37766581d452bbad80aa99711c01363" exitCode=0 Apr 17 16:20:39.364785 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:39.364335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerDied","Data":"a752310c104c5bd67fedfc00a6900ebae37766581d452bbad80aa99711c01363"} Apr 17 16:20:40.368671 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:40.368637 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"a2fcf887afbe488bfe91a4e7887d721d9d4755fd62a1dec4b70bbbd6c4417408"} Apr 17 16:20:41.212365 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:41.212327 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:41.212534 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:41.212328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:41.212534 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:41.212425 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:41.212650 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:41.212536 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:41.372426 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:41.372390 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfa7029-ad7c-4849-aaf2-9516b86babac" containerID="931167243b6c3965fdf29ff1efa6f5e1960ba7e2e44749e7408e9b933e4fd0fe" exitCode=0 Apr 17 16:20:41.372941 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:41.372432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerDied","Data":"931167243b6c3965fdf29ff1efa6f5e1960ba7e2e44749e7408e9b933e4fd0fe"} Apr 17 16:20:42.380051 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.379823 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" event={"ID":"fb03560e-c45d-4041-b046-c5c9b2fd22a8","Type":"ContainerStarted","Data":"30fde1c5e4b70c09183fd4614be4bfc7dfb8e13f52f66c3e0ffbb699b115a7f8"} Apr 17 16:20:42.380503 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.380069 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:42.380503 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.380094 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:42.380503 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.380106 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:42.398642 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.398618 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:42.398794 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.398733 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:20:42.404722 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:42.404683 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" podStartSLOduration=9.558644341 podStartE2EDuration="28.40466982s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.444430746 +0000 UTC m=+2.825059086" lastFinishedPulling="2026-04-17 16:20:35.290456216 +0000 UTC m=+21.671084565" observedRunningTime="2026-04-17 16:20:42.404428306 +0000 UTC m=+28.785056667" watchObservedRunningTime="2026-04-17 16:20:42.40466982 +0000 UTC m=+28.785298180" Apr 17 16:20:43.212699 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:43.212665 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:43.212872 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:43.212666 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:43.212872 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:43.212817 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:43.212978 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:43.212868 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:43.533345 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:43.533262 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hs4nv"] Apr 17 16:20:43.533801 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:43.533404 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:43.533801 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:43.533507 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:43.535998 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:43.535955 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tfgvs"] Apr 17 16:20:43.536132 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:43.536090 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:43.536203 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:43.536173 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:45.212708 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:45.212673 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:45.213182 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:45.212678 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:45.213182 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:45.212792 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:45.213182 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:45.212863 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:47.211865 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:47.211824 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:47.212355 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:47.211824 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:47.212355 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.211953 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:47.212355 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.212099 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:47.830441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:47.830399 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:47.830669 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.830531 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:47.830669 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.830597 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:19.830578353 +0000 UTC m=+66.211206694 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:47.931188 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:47.931154 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:47.931378 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.931322 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:47.931378 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.931340 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:47.931378 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.931350 2569 projected.go:194] Error preparing data for projected volume kube-api-access-fgzmq for pod openshift-network-diagnostics/network-check-target-hs4nv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:47.931532 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:47.931397 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq podName:184a3c91-ad85-4fab-a0ca-a98c92acda61 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:19.931384452 +0000 UTC m=+66.312012791 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgzmq" (UniqueName: "kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq") pod "network-check-target-hs4nv" (UID: "184a3c91-ad85-4fab-a0ca-a98c92acda61") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:48.395199 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:48.395126 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfa7029-ad7c-4849-aaf2-9516b86babac" containerID="cce225e939203c7c6028cc4cd5609c80558c999de30323386b2beb1c7e425b5e" exitCode=0 Apr 17 16:20:48.395199 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:48.395186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerDied","Data":"cce225e939203c7c6028cc4cd5609c80558c999de30323386b2beb1c7e425b5e"} Apr 17 16:20:49.212429 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.212407 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:49.212524 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.212407 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:49.212524 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:49.212507 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tfgvs" podUID="b74a4398-a3fb-40e5-b014-d968d4c10069" Apr 17 16:20:49.212613 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:49.212565 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hs4nv" podUID="184a3c91-ad85-4fab-a0ca-a98c92acda61" Apr 17 16:20:49.399403 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.399373 2569 generic.go:358] "Generic (PLEG): container finished" podID="3dfa7029-ad7c-4849-aaf2-9516b86babac" containerID="c0edfd61efafe2126f3e98850e1b0539672142df53676de0613136a1120bca68" exitCode=0 Apr 17 16:20:49.399862 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.399421 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerDied","Data":"c0edfd61efafe2126f3e98850e1b0539672142df53676de0613136a1120bca68"} Apr 17 16:20:49.405440 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.405417 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-214.ec2.internal" event="NodeReady" Apr 17 16:20:49.405542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.405523 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:20:49.456910 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.456832 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fh45r"] Apr 17 16:20:49.484037 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.484002 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-h5d4m"] Apr 17 16:20:49.484212 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.484188 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.487616 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.487590 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:20:49.487760 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.487691 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:20:49.487832 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.487764 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vgznp\"" Apr 17 16:20:49.500198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.499662 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fh45r"] Apr 17 16:20:49.500198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.499697 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h5d4m"] Apr 17 16:20:49.500198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.499815 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:49.502065 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.502000 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9rhb9\"" Apr 17 16:20:49.502065 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.502041 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:20:49.502428 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.502407 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:20:49.502509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.502473 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:20:49.644726 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.644688 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:49.644726 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.644733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-config-volume\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.644937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.644754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-tmp-dir\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.644937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.644800 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6cb\" (UniqueName: 
\"kubernetes.io/projected/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-kube-api-access-lc6cb\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:49.644937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.644820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94hn8\" (UniqueName: \"kubernetes.io/projected/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-kube-api-access-94hn8\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.644937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.644845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.745595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.745514 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6cb\" (UniqueName: \"kubernetes.io/projected/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-kube-api-access-lc6cb\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:49.745595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.745549 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94hn8\" (UniqueName: \"kubernetes.io/projected/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-kube-api-access-94hn8\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.745595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.745571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.745792 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:49.745665 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:49.745792 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:49.745725 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:20:50.245706945 +0000 UTC m=+36.626335290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:20:49.745792 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.745759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:49.745792 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.745788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-config-volume\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.745932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.745806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-tmp-dir\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.745932 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:49.745878 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:49.745997 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:49.745933 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:50.245921172 +0000 UTC m=+36.626549511 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:20:49.746316 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.746297 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-tmp-dir\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.746473 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.746458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-config-volume\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.755322 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.755302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94hn8\" (UniqueName: \"kubernetes.io/projected/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-kube-api-access-94hn8\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:49.755322 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:49.755316 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6cb\" (UniqueName: \"kubernetes.io/projected/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-kube-api-access-lc6cb\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:50.249895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:50.249861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:50.250086 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:50.249921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:50.250086 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:50.250006 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:50.250086 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:50.250013 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:50.250086 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:50.250069 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:51.250050233 +0000 UTC m=+37.630678578 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:20:50.250086 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:50.250084 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:20:51.25007604 +0000 UTC m=+37.630704380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:20:50.404335 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:50.404299 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" event={"ID":"3dfa7029-ad7c-4849-aaf2-9516b86babac","Type":"ContainerStarted","Data":"dd4a38ec534895e5e51182cf11c1f7287eef67fdc7e71b88e9979f97ccb5f9a6"} Apr 17 16:20:50.426080 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:50.426037 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vfhjq" podStartSLOduration=4.856971829 podStartE2EDuration="36.42602421s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:16.403444971 +0000 UTC m=+2.784073310" lastFinishedPulling="2026-04-17 16:20:47.972497347 +0000 UTC m=+34.353125691" observedRunningTime="2026-04-17 16:20:50.424915991 +0000 UTC m=+36.805544353" watchObservedRunningTime="2026-04-17 16:20:50.42602421 +0000 UTC m=+36.806652572" Apr 17 16:20:51.212181 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.212144 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:20:51.212385 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.212144 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:20:51.216137 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.216116 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:20:51.216137 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.216118 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:20:51.216328 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.216157 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-59zbx\"" Apr 17 16:20:51.216328 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.216164 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qr2mj\"" Apr 17 16:20:51.216328 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.216171 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:20:51.255572 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.255548 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:51.255671 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:51.255599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:51.255718 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:51.255689 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:51.255752 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:51.255748 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:53.255734263 +0000 UTC m=+39.636362603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:20:51.255793 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:51.255696 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:51.255829 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:51.255812 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:20:53.255799319 +0000 UTC m=+39.636427659 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:20:53.269302 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:53.269247 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:53.269716 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:53.269351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:53.269716 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:53.269395 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:53.269716 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:53.269461 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:57.269444433 +0000 UTC m=+43.650072773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:20:53.269716 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:53.269506 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:53.269716 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:53.269561 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:20:57.269545676 +0000 UTC m=+43.650174017 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:20:57.297705 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:57.297515 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:20:57.298126 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:20:57.297741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:20:57.298126 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:57.297662 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:57.298126 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:57.297832 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:57.298126 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:57.297852 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:05.297832991 +0000 UTC m=+51.678461333 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:20:57.298126 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:20:57.297870 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:21:05.297859805 +0000 UTC m=+51.678488144 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:21:05.349285 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:05.349245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:21:05.349761 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:05.349312 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:21:05.349761 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:05.349391 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:21:05.349761 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:05.349453 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:21:05.349761 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:05.349463 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:21:21.349446901 +0000 UTC m=+67.730075241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:21:05.349761 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:05.349508 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:21.349493709 +0000 UTC m=+67.730122054 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:21:14.394968 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:14.394940 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpc4t" Apr 17 16:21:19.851295 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:19.851249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:21:19.853701 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:19.853679 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:21:19.861720 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:19.861697 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:21:19.861840 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:19.861770 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs podName:b74a4398-a3fb-40e5-b014-d968d4c10069 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:23.86174912 +0000 UTC m=+130.242377472 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs") pod "network-metrics-daemon-tfgvs" (UID: "b74a4398-a3fb-40e5-b014-d968d4c10069") : secret "metrics-daemon-secret" not found Apr 17 16:21:19.951992 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:19.951945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:21:19.954588 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:19.954565 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:21:19.964434 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:19.964408 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:21:19.976368 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:19.976338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgzmq\" (UniqueName: \"kubernetes.io/projected/184a3c91-ad85-4fab-a0ca-a98c92acda61-kube-api-access-fgzmq\") pod \"network-check-target-hs4nv\" (UID: \"184a3c91-ad85-4fab-a0ca-a98c92acda61\") " pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:21:20.024688 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:20.024657 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-59zbx\"" Apr 17 16:21:20.032505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:20.032480 
2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:21:20.154208 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:20.154177 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hs4nv"] Apr 17 16:21:20.157708 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:21:20.157683 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184a3c91_ad85_4fab_a0ca_a98c92acda61.slice/crio-e81f8e7cbb48bbebc932d9a7aa5ef95f2ffa29acd809d19c7f8d1860f99ca63b WatchSource:0}: Error finding container e81f8e7cbb48bbebc932d9a7aa5ef95f2ffa29acd809d19c7f8d1860f99ca63b: Status 404 returned error can't find the container with id e81f8e7cbb48bbebc932d9a7aa5ef95f2ffa29acd809d19c7f8d1860f99ca63b Apr 17 16:21:20.458681 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:20.458595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hs4nv" event={"ID":"184a3c91-ad85-4fab-a0ca-a98c92acda61","Type":"ContainerStarted","Data":"e81f8e7cbb48bbebc932d9a7aa5ef95f2ffa29acd809d19c7f8d1860f99ca63b"} Apr 17 16:21:21.362971 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:21.362934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:21:21.363546 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:21.362991 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:21:21.363546 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:21.363127 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:21:21.363546 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:21.363132 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:21:21.363546 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:21.363188 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert podName:dce4e627-2afb-4861-8b1a-4bf531c0f4a7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:53.363169336 +0000 UTC m=+99.743797678 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert") pod "ingress-canary-h5d4m" (UID: "dce4e627-2afb-4861-8b1a-4bf531c0f4a7") : secret "canary-serving-cert" not found Apr 17 16:21:21.363546 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:21.363201 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls podName:ccfa7faf-8272-48ab-b2fa-20b063c3b4ad nodeName:}" failed. No retries permitted until 2026-04-17 16:21:53.36319576 +0000 UTC m=+99.743824099 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls") pod "dns-default-fh45r" (UID: "ccfa7faf-8272-48ab-b2fa-20b063c3b4ad") : secret "dns-default-metrics-tls" not found Apr 17 16:21:23.465287 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:23.465250 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hs4nv" event={"ID":"184a3c91-ad85-4fab-a0ca-a98c92acda61","Type":"ContainerStarted","Data":"4daa79a206880f06baf84c1b48d1c0c207b3c150a1a45fa057f1b86d18de9df9"} Apr 17 16:21:23.465654 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:23.465367 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:21:23.479840 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:23.479793 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hs4nv" podStartSLOduration=66.856460535 podStartE2EDuration="1m9.479779418s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:21:20.15960296 +0000 UTC m=+66.540231299" lastFinishedPulling="2026-04-17 16:21:22.78292184 +0000 UTC m=+69.163550182" observedRunningTime="2026-04-17 16:21:23.479563131 +0000 UTC m=+69.860191492" watchObservedRunningTime="2026-04-17 16:21:23.479779418 +0000 UTC m=+69.860407819" Apr 17 16:21:30.191612 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.191483 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q"] Apr 17 16:21:30.194843 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.194823 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.196906 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.196882 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 16:21:30.197142 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.197128 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:21:30.199432 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.199411 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:21:30.199530 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.199411 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6nzbb\"" Apr 17 16:21:30.199530 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.199419 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 16:21:30.203187 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.203168 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q"] Apr 17 16:21:30.221765 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.221740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/487fdfa2-04ee-41df-9603-b59486486e7e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.221895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.221783 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7mlz\" (UniqueName: \"kubernetes.io/projected/487fdfa2-04ee-41df-9603-b59486486e7e-kube-api-access-j7mlz\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.221895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.221830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.323062 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.323024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7mlz\" (UniqueName: \"kubernetes.io/projected/487fdfa2-04ee-41df-9603-b59486486e7e-kube-api-access-j7mlz\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.323181 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.323072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.323243 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.323198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/487fdfa2-04ee-41df-9603-b59486486e7e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.323343 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:30.323324 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:30.323421 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:30.323410 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls podName:487fdfa2-04ee-41df-9603-b59486486e7e nodeName:}" failed. No retries permitted until 2026-04-17 16:21:30.823390916 +0000 UTC m=+77.204019259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rtp7q" (UID: "487fdfa2-04ee-41df-9603-b59486486e7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:30.323927 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.323908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/487fdfa2-04ee-41df-9603-b59486486e7e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.331479 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.331461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7mlz\" (UniqueName: \"kubernetes.io/projected/487fdfa2-04ee-41df-9603-b59486486e7e-kube-api-access-j7mlz\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.826744 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:30.826658 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:30.826928 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:30.826786 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:30.826928 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:30.826862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls 
podName:487fdfa2-04ee-41df-9603-b59486486e7e nodeName:}" failed. No retries permitted until 2026-04-17 16:21:31.826846897 +0000 UTC m=+78.207475237 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rtp7q" (UID: "487fdfa2-04ee-41df-9603-b59486486e7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:31.833585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:31.833534 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:31.834020 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:31.833680 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:31.834020 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:31.833750 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls podName:487fdfa2-04ee-41df-9603-b59486486e7e nodeName:}" failed. No retries permitted until 2026-04-17 16:21:33.833734086 +0000 UTC m=+80.214362440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rtp7q" (UID: "487fdfa2-04ee-41df-9603-b59486486e7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:33.850208 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:33.850169 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:33.850754 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:33.850364 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:33.850754 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:33.850470 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls podName:487fdfa2-04ee-41df-9603-b59486486e7e nodeName:}" failed. No retries permitted until 2026-04-17 16:21:37.850449269 +0000 UTC m=+84.231077610 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rtp7q" (UID: "487fdfa2-04ee-41df-9603-b59486486e7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:36.060455 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.060423 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8"] Apr 17 16:21:36.063198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.063177 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr"] Apr 17 16:21:36.063349 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.063332 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" Apr 17 16:21:36.065584 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.065563 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 16:21:36.065848 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.065827 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:21:36.066016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.066000 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.066099 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.066086 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-m459n\"" Apr 17 16:21:36.070816 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.070800 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:21:36.070930 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.070852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 16:21:36.070992 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.070968 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 16:21:36.071052 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.071013 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hbsq9\"" Apr 17 16:21:36.072961 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.072939 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8"] Apr 17 16:21:36.079581 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.079561 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr"] Apr 17 16:21:36.164512 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.164471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfhh\" 
(UniqueName: \"kubernetes.io/projected/ce96f4eb-fab8-4101-a158-b4e35d1058c2-kube-api-access-5dfhh\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.164659 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.164534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lh6d\" (UniqueName: \"kubernetes.io/projected/4fd6f5cc-5c82-4053-9185-3fd37df03519-kube-api-access-6lh6d\") pod \"volume-data-source-validator-7c6cbb6c87-44ks8\" (UID: \"4fd6f5cc-5c82-4053-9185-3fd37df03519\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" Apr 17 16:21:36.164659 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.164573 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.265897 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.265858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lh6d\" (UniqueName: \"kubernetes.io/projected/4fd6f5cc-5c82-4053-9185-3fd37df03519-kube-api-access-6lh6d\") pod \"volume-data-source-validator-7c6cbb6c87-44ks8\" (UID: \"4fd6f5cc-5c82-4053-9185-3fd37df03519\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" Apr 17 16:21:36.265897 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.265898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.266157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.265953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfhh\" (UniqueName: \"kubernetes.io/projected/ce96f4eb-fab8-4101-a158-b4e35d1058c2-kube-api-access-5dfhh\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.266157 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:36.266082 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:21:36.266157 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:36.266144 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls podName:ce96f4eb-fab8-4101-a158-b4e35d1058c2 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:36.766128188 +0000 UTC m=+83.146756528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ql8qr" (UID: "ce96f4eb-fab8-4101-a158-b4e35d1058c2") : secret "samples-operator-tls" not found Apr 17 16:21:36.274653 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.274625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfhh\" (UniqueName: \"kubernetes.io/projected/ce96f4eb-fab8-4101-a158-b4e35d1058c2-kube-api-access-5dfhh\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.274825 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.274809 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lh6d\" (UniqueName: \"kubernetes.io/projected/4fd6f5cc-5c82-4053-9185-3fd37df03519-kube-api-access-6lh6d\") pod \"volume-data-source-validator-7c6cbb6c87-44ks8\" (UID: \"4fd6f5cc-5c82-4053-9185-3fd37df03519\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" Apr 17 16:21:36.317891 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.317868 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8vckd_919c2101-3bb9-439c-89fe-f84487ea8e6d/dns-node-resolver/0.log" Apr 17 16:21:36.373809 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.373773 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" Apr 17 16:21:36.487989 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.487957 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8"] Apr 17 16:21:36.490995 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:21:36.490965 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd6f5cc_5c82_4053_9185_3fd37df03519.slice/crio-82cee26e28899d988870c3d7fec1c152ed90363bf12cab57894d052314efeb6e WatchSource:0}: Error finding container 82cee26e28899d988870c3d7fec1c152ed90363bf12cab57894d052314efeb6e: Status 404 returned error can't find the container with id 82cee26e28899d988870c3d7fec1c152ed90363bf12cab57894d052314efeb6e Apr 17 16:21:36.769932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:36.769840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:36.770078 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:36.769965 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:21:36.770078 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:36.770020 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls podName:ce96f4eb-fab8-4101-a158-b4e35d1058c2 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:21:37.770004982 +0000 UTC m=+84.150633320 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ql8qr" (UID: "ce96f4eb-fab8-4101-a158-b4e35d1058c2") : secret "samples-operator-tls" not found Apr 17 16:21:37.317307 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:37.317276 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fptfr_3237eb23-86ef-44a2-98cb-f37d4d9fb915/node-ca/0.log" Apr 17 16:21:37.492070 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:37.492036 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" event={"ID":"4fd6f5cc-5c82-4053-9185-3fd37df03519","Type":"ContainerStarted","Data":"82cee26e28899d988870c3d7fec1c152ed90363bf12cab57894d052314efeb6e"} Apr 17 16:21:37.778855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:37.778769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:37.779008 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:37.778938 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:21:37.779008 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:37.779000 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls podName:ce96f4eb-fab8-4101-a158-b4e35d1058c2 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:39.778985514 +0000 UTC m=+86.159613854 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ql8qr" (UID: "ce96f4eb-fab8-4101-a158-b4e35d1058c2") : secret "samples-operator-tls" not found Apr 17 16:21:37.879257 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:37.879200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:37.879420 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:37.879351 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:37.879463 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:37.879426 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls podName:487fdfa2-04ee-41df-9603-b59486486e7e nodeName:}" failed. No retries permitted until 2026-04-17 16:21:45.879408731 +0000 UTC m=+92.260037069 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rtp7q" (UID: "487fdfa2-04ee-41df-9603-b59486486e7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:38.495332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:38.495302 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" event={"ID":"4fd6f5cc-5c82-4053-9185-3fd37df03519","Type":"ContainerStarted","Data":"c39c1d21c78f877a7cba0c55f013ebb6e991fa43174f99fff8255712c1bd02e4"} Apr 17 16:21:38.509397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:38.509320 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-44ks8" podStartSLOduration=0.980097916 podStartE2EDuration="2.509304854s" podCreationTimestamp="2026-04-17 16:21:36 +0000 UTC" firstStartedPulling="2026-04-17 16:21:36.492822276 +0000 UTC m=+82.873450615" lastFinishedPulling="2026-04-17 16:21:38.0220292 +0000 UTC m=+84.402657553" observedRunningTime="2026-04-17 16:21:38.508675592 +0000 UTC m=+84.889303965" watchObservedRunningTime="2026-04-17 16:21:38.509304854 +0000 UTC m=+84.889933215" Apr 17 16:21:39.792758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:39.792718 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:39.793154 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:39.792864 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:21:39.793154 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:39.792930 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls podName:ce96f4eb-fab8-4101-a158-b4e35d1058c2 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:43.792913218 +0000 UTC m=+90.173541556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ql8qr" (UID: "ce96f4eb-fab8-4101-a158-b4e35d1058c2") : secret "samples-operator-tls" not found Apr 17 16:21:40.202210 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.202176 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns"] Apr 17 16:21:40.204985 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.204970 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.207493 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.207466 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 16:21:40.208474 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.208443 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 16:21:40.208474 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.208461 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-trvpm\"" Apr 17 16:21:40.208474 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.208469 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:21:40.208694 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.208495 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 16:21:40.218170 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.215588 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns"] Apr 17 16:21:40.296425 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.296387 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae9cb04-c427-4571-bac6-8e89f37be1c0-config\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.296425 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.296425 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xrv\" (UniqueName: \"kubernetes.io/projected/0ae9cb04-c427-4571-bac6-8e89f37be1c0-kube-api-access-49xrv\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.296641 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.296502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae9cb04-c427-4571-bac6-8e89f37be1c0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.397186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.397153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae9cb04-c427-4571-bac6-8e89f37be1c0-config\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.397186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.397188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49xrv\" (UniqueName: \"kubernetes.io/projected/0ae9cb04-c427-4571-bac6-8e89f37be1c0-kube-api-access-49xrv\") pod 
\"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.397363 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.397257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae9cb04-c427-4571-bac6-8e89f37be1c0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.397810 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.397788 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae9cb04-c427-4571-bac6-8e89f37be1c0-config\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.399511 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.399494 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ae9cb04-c427-4571-bac6-8e89f37be1c0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.424249 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.423710 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xrv\" (UniqueName: \"kubernetes.io/projected/0ae9cb04-c427-4571-bac6-8e89f37be1c0-kube-api-access-49xrv\") pod \"service-ca-operator-d6fc45fc5-mm8ns\" (UID: \"0ae9cb04-c427-4571-bac6-8e89f37be1c0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.514870 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.514791 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" Apr 17 16:21:40.624475 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:40.624444 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns"] Apr 17 16:21:40.627726 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:21:40.627701 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae9cb04_c427_4571_bac6_8e89f37be1c0.slice/crio-a3a73299c95be17ea573265038774456ba3d077825b41d9a7c56f491f36b87bd WatchSource:0}: Error finding container a3a73299c95be17ea573265038774456ba3d077825b41d9a7c56f491f36b87bd: Status 404 returned error can't find the container with id a3a73299c95be17ea573265038774456ba3d077825b41d9a7c56f491f36b87bd Apr 17 16:21:41.501475 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:41.501437 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" event={"ID":"0ae9cb04-c427-4571-bac6-8e89f37be1c0","Type":"ContainerStarted","Data":"a3a73299c95be17ea573265038774456ba3d077825b41d9a7c56f491f36b87bd"} Apr 17 16:21:43.507627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:43.507593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" event={"ID":"0ae9cb04-c427-4571-bac6-8e89f37be1c0","Type":"ContainerStarted","Data":"b9cc289ef714c754405b7e4a0bc05f5bb063f232852033e7311111c56ccb55a9"} Apr 17 16:21:43.523571 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:43.523521 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" podStartSLOduration=1.22509636 podStartE2EDuration="3.523505718s" podCreationTimestamp="2026-04-17 16:21:40 +0000 UTC" firstStartedPulling="2026-04-17 16:21:40.62948216 +0000 UTC m=+87.010110499" lastFinishedPulling="2026-04-17 16:21:42.927891508 +0000 UTC m=+89.308519857" observedRunningTime="2026-04-17 16:21:43.522740273 +0000 UTC m=+89.903368635" watchObservedRunningTime="2026-04-17 16:21:43.523505718 +0000 UTC m=+89.904134118" Apr 17 16:21:43.820965 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:43.820936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:43.821147 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:43.821105 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:21:43.821214 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:43.821187 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls podName:ce96f4eb-fab8-4101-a158-b4e35d1058c2 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:51.821165934 +0000 UTC m=+98.201794292 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ql8qr" (UID: "ce96f4eb-fab8-4101-a158-b4e35d1058c2") : secret "samples-operator-tls" not found Apr 17 16:21:44.215166 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.215090 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws"] Apr 17 16:21:44.217915 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.217899 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" Apr 17 16:21:44.220207 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.220182 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 16:21:44.221089 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.221071 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 16:21:44.221175 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.221097 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-tgttk\"" Apr 17 16:21:44.224818 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.224794 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws"] Apr 17 16:21:44.324926 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.324882 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjvvm\" (UniqueName: \"kubernetes.io/projected/2e21120d-2f0e-4730-ad47-2a2a7275109d-kube-api-access-zjvvm\") pod \"migrator-74bb7799d9-dtwws\" (UID: \"2e21120d-2f0e-4730-ad47-2a2a7275109d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" Apr 17 16:21:44.425377 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.425337 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjvvm\" (UniqueName: \"kubernetes.io/projected/2e21120d-2f0e-4730-ad47-2a2a7275109d-kube-api-access-zjvvm\") pod \"migrator-74bb7799d9-dtwws\" (UID: \"2e21120d-2f0e-4730-ad47-2a2a7275109d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" Apr 17 16:21:44.433680 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.433654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjvvm\" (UniqueName: \"kubernetes.io/projected/2e21120d-2f0e-4730-ad47-2a2a7275109d-kube-api-access-zjvvm\") pod \"migrator-74bb7799d9-dtwws\" (UID: \"2e21120d-2f0e-4730-ad47-2a2a7275109d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" Apr 17 16:21:44.526559 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.526470 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" Apr 17 16:21:44.641184 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:44.641148 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws"] Apr 17 16:21:44.644026 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:21:44.643992 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e21120d_2f0e_4730_ad47_2a2a7275109d.slice/crio-9b30e0befdf079f56f07cbef9576c0c2e5f5038af7f6cfec85ecd8e396d15c9d WatchSource:0}: Error finding container 9b30e0befdf079f56f07cbef9576c0c2e5f5038af7f6cfec85ecd8e396d15c9d: Status 404 returned error can't find the container with id 9b30e0befdf079f56f07cbef9576c0c2e5f5038af7f6cfec85ecd8e396d15c9d Apr 17 16:21:45.288844 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.288807 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6988f59bd7-zttj5"] Apr 17 16:21:45.292927 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.292911 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.295441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.295421 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:21:45.295441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.295429 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5mbwd\"" Apr 17 16:21:45.295581 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.295457 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:21:45.295771 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.295759 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:21:45.299651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.299629 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:21:45.303573 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.303554 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6988f59bd7-zttj5"] Apr 17 16:21:45.333965 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.333929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-installation-pull-secrets\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.333965 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.333970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-image-registry-private-configuration\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.334157 
ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.334064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5zvw\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-kube-api-access-t5zvw\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.334157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.334091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.334157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.334113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-trusted-ca\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.334157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.334137 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-registry-certificates\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.334311 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.334187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-bound-sa-token\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.334311 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.334213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86881cb-f096-4917-ba77-b03ea33790c7-ca-trust-extracted\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435027 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.434969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-trusted-ca\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435027 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-registry-certificates\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" 
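
Note: the reconciler entries above show the kubelet's two-phase volume handling for image-registry-6988f59bd7-zttj5: reconciler_common.go:251 first runs VerifyControllerAttachedVolume for each declared volume, then reconciler_common.go:224 starts MountVolume for each one independently. The UniqueName prefixes identify the volume plugin involved (kubernetes.io/secret, kubernetes.io/configmap, kubernetes.io/projected, kubernetes.io/empty-dir). A minimal Go sketch of a pod-spec volume list consistent with those prefixes, using the k8s.io/api/core/v1 types; object names marked "assumed" are not confirmed anywhere in this log:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // One entry per volume plugin seen in the reconciler entries above.
        vols := []corev1.Volume{
            {Name: "image-registry-private-configuration", VolumeSource: corev1.VolumeSource{
                // Assumed to reference the same-named secret cached by the reflector.
                Secret: &corev1.SecretVolumeSource{SecretName: "image-registry-private-configuration"},
            }},
            {Name: "trusted-ca", VolumeSource: corev1.VolumeSource{
                // Assumed same-named configmap; only the volume name appears in the log.
                ConfigMap: &corev1.ConfigMapVolumeSource{LocalObjectReference: corev1.LocalObjectReference{Name: "trusted-ca"}},
            }},
            {Name: "ca-trust-extracted", VolumeSource: corev1.VolumeSource{
                // An empty-dir needs no backing API object, so it cannot fail on a missing secret.
                EmptyDir: &corev1.EmptyDirVolumeSource{},
            }},
            {Name: "registry-tls", VolumeSource: corev1.VolumeSource{
                // Projected volume; the source secret name comes from the projected.go error below.
                Projected: &corev1.ProjectedVolumeSource{Sources: []corev1.VolumeProjection{{
                    Secret: &corev1.SecretProjection{LocalObjectReference: corev1.LocalObjectReference{Name: "image-registry-tls"}},
                }}},
            }},
        }
        for _, v := range vols {
            fmt.Println(v.Name)
        }
    }
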
Apr 17 16:21:45.435281 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-bound-sa-token\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435281 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86881cb-f096-4917-ba77-b03ea33790c7-ca-trust-extracted\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435380 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-installation-pull-secrets\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435380 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-image-registry-private-configuration\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435489 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5zvw\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-kube-api-access-t5zvw\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435489 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.435579 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.435549 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:21:45.435579 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.435563 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6988f59bd7-zttj5: secret "image-registry-tls" not found Apr 17 16:21:45.435676 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86881cb-f096-4917-ba77-b03ea33790c7-ca-trust-extracted\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" 
Apr 17 16:21:45.435676 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.435626 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls podName:c86881cb-f096-4917-ba77-b03ea33790c7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:45.935607294 +0000 UTC m=+92.316235639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls") pod "image-registry-6988f59bd7-zttj5" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7") : secret "image-registry-tls" not found Apr 17 16:21:45.435878 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-registry-certificates\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.436006 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.435989 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-trusted-ca\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.437747 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.437730 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-installation-pull-secrets\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.437796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.437763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-image-registry-private-configuration\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.443540 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.443516 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5zvw\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-kube-api-access-t5zvw\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.443651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.443606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-bound-sa-token\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.512150 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.512110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" 
event={"ID":"2e21120d-2f0e-4730-ad47-2a2a7275109d","Type":"ContainerStarted","Data":"9b30e0befdf079f56f07cbef9576c0c2e5f5038af7f6cfec85ecd8e396d15c9d"} Apr 17 16:21:45.940224 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.940186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:21:45.940676 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:45.940288 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:45.940676 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.940318 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:45.940676 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.940387 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls podName:487fdfa2-04ee-41df-9603-b59486486e7e nodeName:}" failed. No retries permitted until 2026-04-17 16:22:01.940362784 +0000 UTC m=+108.320991128 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-rtp7q" (UID: "487fdfa2-04ee-41df-9603-b59486486e7e") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:21:45.940676 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.940409 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:21:45.940676 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.940423 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6988f59bd7-zttj5: secret "image-registry-tls" not found Apr 17 16:21:45.940676 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:45.940475 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls podName:c86881cb-f096-4917-ba77-b03ea33790c7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:46.940459617 +0000 UTC m=+93.321087958 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls") pod "image-registry-6988f59bd7-zttj5" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7") : secret "image-registry-tls" not found Apr 17 16:21:46.495953 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.495912 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4q2rw"] Apr 17 16:21:46.498970 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.498953 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.501590 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.501520 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 16:21:46.501590 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.501573 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 16:21:46.501779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.501573 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 16:21:46.502509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.502491 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-twdrv\"" Apr 17 16:21:46.502619 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.502513 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 16:21:46.508288 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.508263 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4q2rw"] Apr 17 16:21:46.544441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.544406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47d5b197-2a30-4509-ab08-43693d2de2b6-signing-key\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.545068 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.544460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r86v\" (UniqueName: \"kubernetes.io/projected/47d5b197-2a30-4509-ab08-43693d2de2b6-kube-api-access-8r86v\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.545068 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.545018 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47d5b197-2a30-4509-ab08-43693d2de2b6-signing-cabundle\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.646303 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.646260 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47d5b197-2a30-4509-ab08-43693d2de2b6-signing-cabundle\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.646477 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.646379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47d5b197-2a30-4509-ab08-43693d2de2b6-signing-key\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.646477 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:21:46.646411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r86v\" (UniqueName: \"kubernetes.io/projected/47d5b197-2a30-4509-ab08-43693d2de2b6-kube-api-access-8r86v\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.647094 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.647067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/47d5b197-2a30-4509-ab08-43693d2de2b6-signing-cabundle\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.649200 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.649176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/47d5b197-2a30-4509-ab08-43693d2de2b6-signing-key\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.655876 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.655854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r86v\" (UniqueName: \"kubernetes.io/projected/47d5b197-2a30-4509-ab08-43693d2de2b6-kube-api-access-8r86v\") pod \"service-ca-865cb79987-4q2rw\" (UID: \"47d5b197-2a30-4509-ab08-43693d2de2b6\") " pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.810405 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.810379 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4q2rw" Apr 17 16:21:46.924102 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.924070 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4q2rw"] Apr 17 16:21:46.927211 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:21:46.927179 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47d5b197_2a30_4509_ab08_43693d2de2b6.slice/crio-a7717fb4edb30ac5fde6d3e6e2c42f54a6648bf0f6203f66fbca27f7107168e9 WatchSource:0}: Error finding container a7717fb4edb30ac5fde6d3e6e2c42f54a6648bf0f6203f66fbca27f7107168e9: Status 404 returned error can't find the container with id a7717fb4edb30ac5fde6d3e6e2c42f54a6648bf0f6203f66fbca27f7107168e9 Apr 17 16:21:46.948890 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:46.948867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:46.949170 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:46.949009 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:21:46.949170 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:46.949024 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6988f59bd7-zttj5: secret "image-registry-tls" not found Apr 17 16:21:46.949170 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:46.949073 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls podName:c86881cb-f096-4917-ba77-b03ea33790c7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:48.949058331 +0000 UTC m=+95.329686670 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls") pod "image-registry-6988f59bd7-zttj5" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7") : secret "image-registry-tls" not found Apr 17 16:21:47.517789 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:47.517751 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4q2rw" event={"ID":"47d5b197-2a30-4509-ab08-43693d2de2b6","Type":"ContainerStarted","Data":"f0ed81ed3ca981464a5e752bafa3a89453701ce164f56a814f8c383c24417449"} Apr 17 16:21:47.517789 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:47.517793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4q2rw" event={"ID":"47d5b197-2a30-4509-ab08-43693d2de2b6","Type":"ContainerStarted","Data":"a7717fb4edb30ac5fde6d3e6e2c42f54a6648bf0f6203f66fbca27f7107168e9"} Apr 17 16:21:47.519282 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:47.519257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" event={"ID":"2e21120d-2f0e-4730-ad47-2a2a7275109d","Type":"ContainerStarted","Data":"cc9622eaf472685ec7823de8dd3bf65999d2fe5308cceeba05e661b07b858aed"} Apr 17 16:21:47.519391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:47.519287 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" event={"ID":"2e21120d-2f0e-4730-ad47-2a2a7275109d","Type":"ContainerStarted","Data":"7d82ba279166ff0863f2ac613fdc491d0c1a1efc904fe57df4f88171dfeece61"} Apr 17 16:21:47.534322 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:47.534277 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-4q2rw" podStartSLOduration=1.534263185 podStartE2EDuration="1.534263185s" podCreationTimestamp="2026-04-17 16:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:21:47.533474304 +0000 UTC m=+93.914102690" watchObservedRunningTime="2026-04-17 16:21:47.534263185 +0000 UTC m=+93.914891545" Apr 17 16:21:47.548127 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:47.547973 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-dtwws" podStartSLOduration=1.462202901 podStartE2EDuration="3.547959184s" podCreationTimestamp="2026-04-17 16:21:44 +0000 UTC" firstStartedPulling="2026-04-17 16:21:44.645917564 +0000 UTC m=+91.026545904" lastFinishedPulling="2026-04-17 16:21:46.731673845 +0000 UTC m=+93.112302187" observedRunningTime="2026-04-17 16:21:47.547879043 +0000 UTC m=+93.928507406" watchObservedRunningTime="2026-04-17 16:21:47.547959184 +0000 UTC m=+93.928587545" Apr 17 16:21:48.964068 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:48.964016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod 
\"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:48.964558 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:48.964191 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:21:48.964558 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:48.964217 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6988f59bd7-zttj5: secret "image-registry-tls" not found Apr 17 16:21:48.964558 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:21:48.964301 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls podName:c86881cb-f096-4917-ba77-b03ea33790c7 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:52.96427999 +0000 UTC m=+99.344908334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls") pod "image-registry-6988f59bd7-zttj5" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7") : secret "image-registry-tls" not found Apr 17 16:21:51.886637 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:51.886596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:51.888959 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:51.888914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce96f4eb-fab8-4101-a158-b4e35d1058c2-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ql8qr\" (UID: \"ce96f4eb-fab8-4101-a158-b4e35d1058c2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:51.979414 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:51.979377 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" Apr 17 16:21:52.098958 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:52.098919 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr"] Apr 17 16:21:52.532793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:52.532714 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" event={"ID":"ce96f4eb-fab8-4101-a158-b4e35d1058c2","Type":"ContainerStarted","Data":"40a2c5ee7c95077b648685b8917820795c59b2c9b40489659f0e3b97b9830d3a"} Apr 17 16:21:52.994582 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:52.994532 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:52.997718 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:52.997689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"image-registry-6988f59bd7-zttj5\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:53.102365 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.102326 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:53.227982 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.227947 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6988f59bd7-zttj5"] Apr 17 16:21:53.397786 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.397748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:21:53.397977 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.397843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:21:53.402557 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.402498 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dce4e627-2afb-4861-8b1a-4bf531c0f4a7-cert\") pod \"ingress-canary-h5d4m\" (UID: \"dce4e627-2afb-4861-8b1a-4bf531c0f4a7\") " pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:21:53.403129 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.403079 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ccfa7faf-8272-48ab-b2fa-20b063c3b4ad-metrics-tls\") pod \"dns-default-fh45r\" (UID: \"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad\") " pod="openshift-dns/dns-default-fh45r" Apr 17 16:21:53.412966 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:21:53.412769 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9rhb9\"" Apr 17 16:21:53.420766 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.420744 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h5d4m" Apr 17 16:21:53.537351 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.537306 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" event={"ID":"c86881cb-f096-4917-ba77-b03ea33790c7","Type":"ContainerStarted","Data":"4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9"} Apr 17 16:21:53.537351 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.537346 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" event={"ID":"c86881cb-f096-4917-ba77-b03ea33790c7","Type":"ContainerStarted","Data":"460dc60e1cd5eba2f6756ef574aeb9a837e9f092dae0a6301425b9682545e8f2"} Apr 17 16:21:53.537834 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.537798 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:21:53.553072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.553020 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h5d4m"] Apr 17 16:21:53.698942 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.698908 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vgznp\"" Apr 17 16:21:53.707032 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:53.706986 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fh45r" Apr 17 16:21:54.231331 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.231180 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" podStartSLOduration=9.231159643 podStartE2EDuration="9.231159643s" podCreationTimestamp="2026-04-17 16:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:21:53.557936066 +0000 UTC m=+99.938564428" watchObservedRunningTime="2026-04-17 16:21:54.231159643 +0000 UTC m=+100.611788005" Apr 17 16:21:54.231730 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.231603 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fh45r"] Apr 17 16:21:54.469459 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.469378 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hs4nv" Apr 17 16:21:54.541461 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.541421 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h5d4m" event={"ID":"dce4e627-2afb-4861-8b1a-4bf531c0f4a7","Type":"ContainerStarted","Data":"d9239e9dd80a76b9937db9a51c2e58d079d23121478e2b5384b8d1a91bfccc94"} Apr 17 16:21:54.542748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.542713 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fh45r" event={"ID":"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad","Type":"ContainerStarted","Data":"51f67c1f0bb8620f43c11ac7627dea1e9402cad84a4b6d462c6370ed0836cc11"} Apr 17 16:21:54.544787 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.544717 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" event={"ID":"ce96f4eb-fab8-4101-a158-b4e35d1058c2","Type":"ContainerStarted","Data":"6fd5145179eadaf0894ebd00f28c99445baddc0e1c75ef86dc1cddaf6ea41ad6"} Apr 17 16:21:54.544787 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.544759 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" event={"ID":"ce96f4eb-fab8-4101-a158-b4e35d1058c2","Type":"ContainerStarted","Data":"0d6d8727371be147c5b66e79d8de7c39e480c0f112ba3ce2ad900b35f35d73c9"} Apr 17 16:21:54.561125 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:54.561073 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ql8qr" podStartSLOduration=16.550216741 podStartE2EDuration="18.561053135s" podCreationTimestamp="2026-04-17 16:21:36 +0000 UTC" firstStartedPulling="2026-04-17 16:21:52.139861135 +0000 UTC m=+98.520489474" lastFinishedPulling="2026-04-17 16:21:54.150697511 +0000 UTC m=+100.531325868" observedRunningTime="2026-04-17 16:21:54.560087259 +0000 UTC m=+100.940715622" watchObservedRunningTime="2026-04-17 16:21:54.561053135 +0000 UTC m=+100.941681492" Apr 17 16:21:56.552587 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:56.552552 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h5d4m" event={"ID":"dce4e627-2afb-4861-8b1a-4bf531c0f4a7","Type":"ContainerStarted","Data":"bfc0ff0f5d0e2c9f4dbb2fc1edd4bce6ef8861bea8fa8c394184acbef8ac1c0f"} Apr 17 16:21:56.567695 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:21:56.567639 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-h5d4m" podStartSLOduration=65.317008825 podStartE2EDuration="1m7.56762067s" podCreationTimestamp="2026-04-17 16:20:49 +0000 UTC" firstStartedPulling="2026-04-17 16:21:53.560313051 +0000 UTC m=+99.940941395" lastFinishedPulling="2026-04-17 16:21:55.810924901 +0000 UTC m=+102.191553240" observedRunningTime="2026-04-17 16:21:56.567151352 +0000 UTC m=+102.947779713" watchObservedRunningTime="2026-04-17 16:21:56.56762067 +0000 UTC m=+102.948249032" Apr 17 16:21:57.557251 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:57.557200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fh45r" event={"ID":"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad","Type":"ContainerStarted","Data":"e1deaa6102908f613c07ac33a043488d9cee46284e555efe92a3ece665372967"} Apr 17 16:21:57.557251 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:57.557255 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fh45r" event={"ID":"ccfa7faf-8272-48ab-b2fa-20b063c3b4ad","Type":"ContainerStarted","Data":"c2be9ac1499a8eae6f00ff8975b1e0ba62a48851d3b18838c1a55f58d882fe34"} Apr 17 16:21:57.572828 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:57.572786 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fh45r" podStartSLOduration=65.893917925 podStartE2EDuration="1m8.572772619s" podCreationTimestamp="2026-04-17 16:20:49 +0000 UTC" firstStartedPulling="2026-04-17 16:21:54.241359798 +0000 UTC m=+100.621988141" lastFinishedPulling="2026-04-17 16:21:56.920214495 +0000 UTC m=+103.300842835" observedRunningTime="2026-04-17 16:21:57.572474533 +0000 UTC m=+103.953102898" watchObservedRunningTime="2026-04-17 16:21:57.572772619 +0000 UTC m=+103.953400979" Apr 17 16:21:58.562929 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:21:58.562897 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fh45r" Apr 17 16:22:01.967391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:01.967345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:22:01.969822 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:01.969802 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/487fdfa2-04ee-41df-9603-b59486486e7e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-rtp7q\" (UID: \"487fdfa2-04ee-41df-9603-b59486486e7e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:22:02.003820 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:02.003781 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" Apr 17 16:22:02.118112 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:02.118082 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q"] Apr 17 16:22:02.121201 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:02.121171 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487fdfa2_04ee_41df_9603_b59486486e7e.slice/crio-3c4e0181a441380d83542d56faf128d649a6d76bf120c75b72fb950990da831c WatchSource:0}: Error finding container 3c4e0181a441380d83542d56faf128d649a6d76bf120c75b72fb950990da831c: Status 404 returned error can't find the container with id 3c4e0181a441380d83542d56faf128d649a6d76bf120c75b72fb950990da831c Apr 17 16:22:02.574852 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:02.574816 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" event={"ID":"487fdfa2-04ee-41df-9603-b59486486e7e","Type":"ContainerStarted","Data":"3c4e0181a441380d83542d56faf128d649a6d76bf120c75b72fb950990da831c"} Apr 17 16:22:04.585364 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:04.585324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" event={"ID":"487fdfa2-04ee-41df-9603-b59486486e7e","Type":"ContainerStarted","Data":"5466173b0ddf5f9efdc121f1e0d1c910d6b319abd77d182a53720c44fb553290"} Apr 17 16:22:04.600012 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:04.599959 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-rtp7q" podStartSLOduration=32.634386555 podStartE2EDuration="34.599946707s" podCreationTimestamp="2026-04-17 16:21:30 +0000 UTC" firstStartedPulling="2026-04-17 16:22:02.124799614 +0000 UTC m=+108.505427953" lastFinishedPulling="2026-04-17 16:22:04.090359767 +0000 UTC m=+110.470988105" observedRunningTime="2026-04-17 16:22:04.599186409 +0000 UTC m=+110.979814770" watchObservedRunningTime="2026-04-17 16:22:04.599946707 +0000 UTC m=+110.980575067" Apr 17 16:22:06.019886 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.019853 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6988f59bd7-zttj5"] Apr 17 16:22:06.066463 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.066430 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw"] Apr 17 16:22:06.068411 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.068395 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:06.070824 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.070799 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 16:22:06.070913 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.070836 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-rlmx2\"" Apr 17 16:22:06.080978 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.080958 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw"] Apr 17 16:22:06.086441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.086422 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gz7tf"] Apr 17 16:22:06.088550 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.088528 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.090879 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.090860 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:22:06.090957 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.090897 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:22:06.090957 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.090905 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8zcsn\"" Apr 17 16:22:06.091136 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.091121 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:22:06.091375 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.091359 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:22:06.098686 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.098666 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gz7tf"] Apr 17 16:22:06.199600 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.199568 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.199600 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.199607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.199827 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.199630 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fdd67f61-79df-48f1-af95-85e192096fa7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xzkcw\" (UID: \"fdd67f61-79df-48f1-af95-85e192096fa7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:06.199827 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.199650 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-crio-socket\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.199827 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.199754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6nx6\" (UniqueName: \"kubernetes.io/projected/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-kube-api-access-z6nx6\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.199827 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.199795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-data-volume\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.300789 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.300693 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.300789 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.300730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.301006 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.300828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fdd67f61-79df-48f1-af95-85e192096fa7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xzkcw\" (UID: \"fdd67f61-79df-48f1-af95-85e192096fa7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:06.301006 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.300869 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-crio-socket\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.301006 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:22:06.300921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6nx6\" (UniqueName: \"kubernetes.io/projected/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-kube-api-access-z6nx6\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.301006 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.300972 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-data-volume\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.301311 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.301256 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-crio-socket\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.301311 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.301303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-data-volume\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.301456 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.301438 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.303171 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.303153 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.303748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.303732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fdd67f61-79df-48f1-af95-85e192096fa7-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-xzkcw\" (UID: \"fdd67f61-79df-48f1-af95-85e192096fa7\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:06.308645 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.308624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6nx6\" (UniqueName: \"kubernetes.io/projected/bdfda63a-84fe-48c6-817f-4ccccdc6ceae-kube-api-access-z6nx6\") pod \"insights-runtime-extractor-gz7tf\" (UID: \"bdfda63a-84fe-48c6-817f-4ccccdc6ceae\") " pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.376946 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.376906 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:06.396740 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.396704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gz7tf" Apr 17 16:22:06.519567 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.519532 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw"] Apr 17 16:22:06.525294 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:06.525066 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd67f61_79df_48f1_af95_85e192096fa7.slice/crio-9be7a3196128a1588b7a8723b2488934110d67f7deaa55c42bb91725d284b97c WatchSource:0}: Error finding container 9be7a3196128a1588b7a8723b2488934110d67f7deaa55c42bb91725d284b97c: Status 404 returned error can't find the container with id 9be7a3196128a1588b7a8723b2488934110d67f7deaa55c42bb91725d284b97c Apr 17 16:22:06.553035 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.552962 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gz7tf"] Apr 17 16:22:06.556134 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:06.556108 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdfda63a_84fe_48c6_817f_4ccccdc6ceae.slice/crio-d5f40206601cace240cdb60226fd7e1991e703b93459dfb234b5308eb88d91c8 WatchSource:0}: Error finding container d5f40206601cace240cdb60226fd7e1991e703b93459dfb234b5308eb88d91c8: Status 404 returned error can't find the container with id d5f40206601cace240cdb60226fd7e1991e703b93459dfb234b5308eb88d91c8 Apr 17 16:22:06.591040 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.591011 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gz7tf" event={"ID":"bdfda63a-84fe-48c6-817f-4ccccdc6ceae","Type":"ContainerStarted","Data":"d5f40206601cace240cdb60226fd7e1991e703b93459dfb234b5308eb88d91c8"} Apr 17 16:22:06.592145 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:06.592110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" event={"ID":"fdd67f61-79df-48f1-af95-85e192096fa7","Type":"ContainerStarted","Data":"9be7a3196128a1588b7a8723b2488934110d67f7deaa55c42bb91725d284b97c"} Apr 17 16:22:07.596647 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:07.596609 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gz7tf" event={"ID":"bdfda63a-84fe-48c6-817f-4ccccdc6ceae","Type":"ContainerStarted","Data":"cfc774e67324e57d15abdecc9f7f24cecb2973e9237615a26949bcbc97b132d4"} Apr 17 16:22:07.596647 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:07.596644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gz7tf" event={"ID":"bdfda63a-84fe-48c6-817f-4ccccdc6ceae","Type":"ContainerStarted","Data":"6ea3b428a31c7edf7732aca8dc98b2e6d69f5d9f92e5d400fc6375454d0d09c0"} Apr 17 16:22:08.567503 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:08.567461 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fh45r" Apr 17 16:22:08.602444 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:08.602394 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" event={"ID":"fdd67f61-79df-48f1-af95-85e192096fa7","Type":"ContainerStarted","Data":"ca5454f7e6bad56cbb6e017d873c918befcf8d479ed35f5bd1ce9909f5e2926e"} Apr 17 16:22:08.602897 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:08.602716 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:08.608322 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:08.608296 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" Apr 17 16:22:08.620828 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:08.620750 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-xzkcw" podStartSLOduration=1.11074974 podStartE2EDuration="2.620730394s" podCreationTimestamp="2026-04-17 16:22:06 +0000 UTC" firstStartedPulling="2026-04-17 16:22:06.527392263 +0000 UTC m=+112.908020604" lastFinishedPulling="2026-04-17 16:22:08.037372916 +0000 UTC m=+114.418001258" observedRunningTime="2026-04-17 16:22:08.61985647 +0000 UTC m=+115.000484830" watchObservedRunningTime="2026-04-17 16:22:08.620730394 +0000 UTC m=+115.001358757" Apr 17 16:22:09.596836 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.596798 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jdrxz"] Apr 17 16:22:09.599001 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.598978 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.601609 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.601583 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 16:22:09.601736 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.601648 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 16:22:09.601793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.601740 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-xqhvm\"" Apr 17 16:22:09.602889 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.602871 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:22:09.607445 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.607406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gz7tf" event={"ID":"bdfda63a-84fe-48c6-817f-4ccccdc6ceae","Type":"ContainerStarted","Data":"4d47ad779abc4a30aa6387fe2e1f2a3d029e48ff697cc7cda46f1f1490c04e5c"} Apr 17 16:22:09.608047 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.608024 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jdrxz"] Apr 17 16:22:09.637654 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.637606 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gz7tf" podStartSLOduration=0.961279099 
podStartE2EDuration="3.63759275s" podCreationTimestamp="2026-04-17 16:22:06 +0000 UTC" firstStartedPulling="2026-04-17 16:22:06.611054971 +0000 UTC m=+112.991683310" lastFinishedPulling="2026-04-17 16:22:09.287368612 +0000 UTC m=+115.667996961" observedRunningTime="2026-04-17 16:22:09.636529618 +0000 UTC m=+116.017157980" watchObservedRunningTime="2026-04-17 16:22:09.63759275 +0000 UTC m=+116.018221112" Apr 17 16:22:09.726640 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.726608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlx75\" (UniqueName: \"kubernetes.io/projected/b2347694-fcef-49a0-9562-dc4a50f629e0-kube-api-access-hlx75\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.726802 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.726670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.726885 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.726793 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.726885 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.726873 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2347694-fcef-49a0-9562-dc4a50f629e0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.827974 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.827935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2347694-fcef-49a0-9562-dc4a50f629e0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.827974 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.827974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlx75\" (UniqueName: \"kubernetes.io/projected/b2347694-fcef-49a0-9562-dc4a50f629e0-kube-api-access-hlx75\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.828214 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.827998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: 
\"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.828214 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.828031 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.828214 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:22:09.828156 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 17 16:22:09.828391 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:22:09.828221 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-tls podName:b2347694-fcef-49a0-9562-dc4a50f629e0 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:10.328203092 +0000 UTC m=+116.708831434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-jdrxz" (UID: "b2347694-fcef-49a0-9562-dc4a50f629e0") : secret "prometheus-operator-tls" not found Apr 17 16:22:09.828713 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.828695 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2347694-fcef-49a0-9562-dc4a50f629e0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.830401 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.830383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:09.836924 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:09.836903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlx75\" (UniqueName: \"kubernetes.io/projected/b2347694-fcef-49a0-9562-dc4a50f629e0-kube-api-access-hlx75\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:10.332782 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:10.332735 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:10.335125 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:10.335108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b2347694-fcef-49a0-9562-dc4a50f629e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jdrxz\" (UID: \"b2347694-fcef-49a0-9562-dc4a50f629e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:10.509830 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:10.509778 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" Apr 17 16:22:10.630048 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:10.629993 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jdrxz"] Apr 17 16:22:10.633495 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:10.633463 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2347694_fcef_49a0_9562_dc4a50f629e0.slice/crio-326851f6c8a8e72d8058e1210d3ab491a84d5bee4f7f044f2fb3b99431047669 WatchSource:0}: Error finding container 326851f6c8a8e72d8058e1210d3ab491a84d5bee4f7f044f2fb3b99431047669: Status 404 returned error can't find the container with id 326851f6c8a8e72d8058e1210d3ab491a84d5bee4f7f044f2fb3b99431047669 Apr 17 16:22:11.612586 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:11.612548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" event={"ID":"b2347694-fcef-49a0-9562-dc4a50f629e0","Type":"ContainerStarted","Data":"326851f6c8a8e72d8058e1210d3ab491a84d5bee4f7f044f2fb3b99431047669"} Apr 17 16:22:12.617443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:12.617410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" event={"ID":"b2347694-fcef-49a0-9562-dc4a50f629e0","Type":"ContainerStarted","Data":"6db9212d8265f70e60415e1a48b904da9f3ceacb2c59f92aadb6dbf713052f06"} Apr 17 16:22:12.617443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:12.617448 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" event={"ID":"b2347694-fcef-49a0-9562-dc4a50f629e0","Type":"ContainerStarted","Data":"dff9d3c894d6bbe5f88a939aa5d6a0dc5e3f45580cfde39e9d270cd118d12bc9"} Apr 17 16:22:12.633777 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:12.633734 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jdrxz" podStartSLOduration=2.256853856 podStartE2EDuration="3.633721252s" podCreationTimestamp="2026-04-17 16:22:09 +0000 UTC" firstStartedPulling="2026-04-17 16:22:10.635382696 +0000 UTC m=+117.016011035" lastFinishedPulling="2026-04-17 16:22:12.012250085 +0000 UTC m=+118.392878431" observedRunningTime="2026-04-17 16:22:12.632567617 +0000 UTC m=+119.013195978" watchObservedRunningTime="2026-04-17 16:22:12.633721252 +0000 UTC m=+119.014349614" Apr 17 16:22:14.927532 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:14.927495 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v5n6c"] Apr 17 16:22:14.930913 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:14.930878 2569 util.go:30] "No sandbox for pod can be found. 
Apr 17 16:22:14.933582 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:14.933481 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 16:22:14.933582 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:14.933564 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 16:22:14.933782 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:14.933743 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 16:22:14.933839 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:14.933781 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-z7htb\""
Apr 17 16:22:15.072251 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.072155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-sys\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.072551 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.072521 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-textfile\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.072824 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.072760 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-accelerators-collector-config\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.073008 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.072964 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-root\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.073190 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.073142 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.073339 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.073324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4ts6\" (UniqueName: \"kubernetes.io/projected/4b73a405-15f3-43c5-bf6a-43a8219a181a-kube-api-access-t4ts6\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.073494 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.073474 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-wtmp\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.073624 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.073611 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b73a405-15f3-43c5-bf6a-43a8219a181a-metrics-client-ca\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.073756 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.073744 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-tls\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.174967 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.174929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.174978 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4ts6\" (UniqueName: \"kubernetes.io/projected/4b73a405-15f3-43c5-bf6a-43a8219a181a-kube-api-access-t4ts6\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-wtmp\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175034 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b73a405-15f3-43c5-bf6a-43a8219a181a-metrics-client-ca\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-tls\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-sys\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-textfile\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175160 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-accelerators-collector-config\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-root\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-root\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.175804 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:22:15.175781 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 16:22:15.175872 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:22:15.175854 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-tls podName:4b73a405-15f3-43c5-bf6a-43a8219a181a nodeName:}" failed. No retries permitted until 2026-04-17 16:22:15.675832751 +0000 UTC m=+122.056461097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-tls") pod "node-exporter-v5n6c" (UID: "4b73a405-15f3-43c5-bf6a-43a8219a181a") : secret "node-exporter-tls" not found
Apr 17 16:22:15.176002 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.175981 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-textfile\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.176071 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.176050 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-sys\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.176519 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.176267 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-wtmp\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.176668 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.176538 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-accelerators-collector-config\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.176755 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.176726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4b73a405-15f3-43c5-bf6a-43a8219a181a-metrics-client-ca\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.178949 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.178831 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.185332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.185311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4ts6\" (UniqueName: \"kubernetes.io/projected/4b73a405-15f3-43c5-bf6a-43a8219a181a-kube-api-access-t4ts6\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.680685 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.680654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-tls\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.683059 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.683032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4b73a405-15f3-43c5-bf6a-43a8219a181a-node-exporter-tls\") pod \"node-exporter-v5n6c\" (UID: \"4b73a405-15f3-43c5-bf6a-43a8219a181a\") " pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.842843 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:15.842815 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v5n6c"
Apr 17 16:22:15.851871 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:15.851839 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b73a405_15f3_43c5_bf6a_43a8219a181a.slice/crio-0ef12de1827360f57e042b13c1c48925bd2a34aef76ab6cf18aff944c7f5f351 WatchSource:0}: Error finding container 0ef12de1827360f57e042b13c1c48925bd2a34aef76ab6cf18aff944c7f5f351: Status 404 returned error can't find the container with id 0ef12de1827360f57e042b13c1c48925bd2a34aef76ab6cf18aff944c7f5f351
Apr 17 16:22:16.026216 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.026140 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5"
Apr 17 16:22:16.629193 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.629155 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v5n6c" event={"ID":"4b73a405-15f3-43c5-bf6a-43a8219a181a","Type":"ContainerStarted","Data":"0ef12de1827360f57e042b13c1c48925bd2a34aef76ab6cf18aff944c7f5f351"}
Apr 17 16:22:16.900704 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.900616 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"]
Apr 17 16:22:16.903311 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.903292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
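Each entry above carries two timestamps: journald's (Apr 17 16:22:...) and the klog header that kubenswrapper itself writes (severity letter, MMDD, wall-clock time, PID, source file:line). For ad-hoc analysis of a dump like this it helps to split that header out; a small sketch follows, where the regular expression is an assumption tuned to the lines seen here rather than a complete klog grammar:

```go
// Minimal parser for the klog header embedded in each journal entry above.
package main

import (
	"fmt"
	"regexp"
)

// Groups: severity (I/W/E/F), MMDD, time, PID, source file:line, message.
var klogHeader = regexp.MustCompile(
	`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./_-]+:\d+)\] (.*)$`)

func main() {
	// A line copied from the log above.
	line := `I0417 16:22:15.683032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\"" pod="openshift-monitoring/node-exporter-v5n6c"`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		panic("not a klog-formatted line")
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmsg=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
```

Filtering on the source field (reconciler_common.go, operation_generator.go, kubelet.go) is a quick way to separate volume traffic from SyncLoop events in a transcript this dense.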
Apr 17 16:22:16.906015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.905935 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 16:22:16.906015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.905949 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 16:22:16.906015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.905974 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 16:22:16.906015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.905983 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 16:22:16.906015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.905942 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 16:22:16.906341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.906027 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dp6snf3rtq51k\""
Apr 17 16:22:16.906341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.905995 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-hm7xd\""
Apr 17 16:22:16.913061 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.913037 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"]
Apr 17 16:22:16.991735 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991702 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.991920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-metrics-client-ca\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.991920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991767 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.991920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991797 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcdn7\" (UniqueName: \"kubernetes.io/projected/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-kube-api-access-zcdn7\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.991920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.991920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991899 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-tls\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.992091 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991974 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:16.992091 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:16.991994 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-grpc-tls\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.092941 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.092901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.092941 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.092947 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcdn7\" (UniqueName: \"kubernetes.io/projected/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-kube-api-access-zcdn7\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.093470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.092967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.093470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.093015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-tls\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.093470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.093055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.093470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.093104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-grpc-tls\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.093470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.093167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.093470 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.093196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-metrics-client-ca\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.094145 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.094118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-metrics-client-ca\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.095740 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.095711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.095855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.095783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.096028 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.096006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.096141 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.096128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-tls\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.096185 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.096165 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.096237 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.096180 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-secret-grpc-tls\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.100488 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.100466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcdn7\" (UniqueName: \"kubernetes.io/projected/058c398d-5b48-483c-bc96-bd8e2f9f3bc3-kube-api-access-zcdn7\") pod \"thanos-querier-64f4f5c6b8-6cdhc\" (UID: \"058c398d-5b48-483c-bc96-bd8e2f9f3bc3\") " pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.212591 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.212502 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"
Apr 17 16:22:17.334898 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.334866 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc"]
Apr 17 16:22:17.338566 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:17.338534 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058c398d_5b48_483c_bc96_bd8e2f9f3bc3.slice/crio-c951ae2a164603c16bbe5b92610a0487a09ebaaeb6a4f77ba6cc34b003564195 WatchSource:0}: Error finding container c951ae2a164603c16bbe5b92610a0487a09ebaaeb6a4f77ba6cc34b003564195: Status 404 returned error can't find the container with id c951ae2a164603c16bbe5b92610a0487a09ebaaeb6a4f77ba6cc34b003564195
Apr 17 16:22:17.637746 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.637707 2569 generic.go:358] "Generic (PLEG): container finished" podID="4b73a405-15f3-43c5-bf6a-43a8219a181a" containerID="fda3ec86dd4c8f3562d806ffb0f5ba9d386809d285638199f876b01a7f28044e" exitCode=0
Apr 17 16:22:17.637959 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.637765 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v5n6c" event={"ID":"4b73a405-15f3-43c5-bf6a-43a8219a181a","Type":"ContainerDied","Data":"fda3ec86dd4c8f3562d806ffb0f5ba9d386809d285638199f876b01a7f28044e"}
Apr 17 16:22:17.638933 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:17.638909 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"c951ae2a164603c16bbe5b92610a0487a09ebaaeb6a4f77ba6cc34b003564195"}
Apr 17 16:22:18.643856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:18.643820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v5n6c" event={"ID":"4b73a405-15f3-43c5-bf6a-43a8219a181a","Type":"ContainerStarted","Data":"a1250be1a1213490b13041f298a69362ee1703550fe162050b02554c9262906d"}
Apr 17 16:22:18.644365 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:18.643865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v5n6c" event={"ID":"4b73a405-15f3-43c5-bf6a-43a8219a181a","Type":"ContainerStarted","Data":"f3b89f1a0dcb65421ab4f32bb3d633c3811252c01d1b4aa97a6908a2d916499e"}
Apr 17 16:22:18.663842 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:18.663794 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v5n6c" podStartSLOduration=3.923337526 podStartE2EDuration="4.663778282s" podCreationTimestamp="2026-04-17 16:22:14 +0000 UTC" firstStartedPulling="2026-04-17 16:22:15.853452652 +0000 UTC m=+122.234080993" lastFinishedPulling="2026-04-17 16:22:16.593893404 +0000 UTC m=+122.974521749" observedRunningTime="2026-04-17 16:22:18.6626662 +0000 UTC m=+125.043294563" watchObservedRunningTime="2026-04-17 16:22:18.663778282 +0000 UTC m=+125.044406645"
Apr 17 16:22:19.224744 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.224713 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"]
Apr 17 16:22:19.227127 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.227099 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.229509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.229480 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 16:22:19.229509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.229506 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 16:22:19.229739 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.229480 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 16:22:19.229739 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.229490 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 16:22:19.230509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.230490 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-zc5lp\""
Apr 17 16:22:19.230509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.230505 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2un09t0nnmov\""
Apr 17 16:22:19.237767 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.237743 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"]
Apr 17 16:22:19.313428 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-metrics-server-audit-profiles\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.313611 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313442 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-audit-log\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.313611 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-secret-metrics-server-client-certs\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.313611 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.313611 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-client-ca-bundle\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.313611 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-secret-metrics-server-tls\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.313611 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.313600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjhjd\" (UniqueName: \"kubernetes.io/projected/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-kube-api-access-fjhjd\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414249 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414203 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-secret-metrics-server-tls\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414425 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjhjd\" (UniqueName: \"kubernetes.io/projected/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-kube-api-access-fjhjd\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414425 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-metrics-server-audit-profiles\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414425 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-audit-log\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414603 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414428 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-secret-metrics-server-client-certs\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414603 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414603 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-client-ca-bundle\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.414845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.414815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-audit-log\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.415338 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.415317 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-metrics-server-audit-profiles\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.415764 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.415735 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.416815 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.416793 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-client-ca-bundle\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.416896 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.416884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-secret-metrics-server-tls\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.416992 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.416976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-secret-metrics-server-client-certs\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.421688 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.421668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjhjd\" (UniqueName: \"kubernetes.io/projected/f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2-kube-api-access-fjhjd\") pod \"metrics-server-5fdb8f4b6c-5f6rz\" (UID: \"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2\") " pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.538066 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.537980 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"
Apr 17 16:22:19.665280 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:19.665252 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz"]
Apr 17 16:22:19.669747 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:19.669574 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf59d2ca0_0063_4dfb_bb47_dc1e456cc4b2.slice/crio-07913d9eba269cc116c1533a96f224ef1f1998b3279ca7c95c7f5ea9b807ecce WatchSource:0}: Error finding container 07913d9eba269cc116c1533a96f224ef1f1998b3279ca7c95c7f5ea9b807ecce: Status 404 returned error can't find the container with id 07913d9eba269cc116c1533a96f224ef1f1998b3279ca7c95c7f5ea9b807ecce
Apr 17 16:22:20.092604 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.092561 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-78bf64869c-bbggk"]
Apr 17 16:22:20.095670 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.095653 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.098106 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.098086 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 16:22:20.098196 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.098146 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-57pr9\""
Apr 17 16:22:20.098196 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.098091 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 16:22:20.098320 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.098255 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 16:22:20.098537 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.098516 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 16:22:20.098661 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.098518 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 16:22:20.103479 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.103352 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 16:22:20.104656 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.104634 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-78bf64869c-bbggk"]
Apr 17 16:22:20.222603 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222557 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-federate-client-tls\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.222603 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222601 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-metrics-client-ca\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.222845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-serving-certs-ca-bundle\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.222845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-secret-telemeter-client\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.222845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbnkc\" (UniqueName: \"kubernetes.io/projected/509250ea-5b8f-46b2-9140-e92b0d75346e-kube-api-access-pbnkc\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.223009 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.223009 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222961 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.223009 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.222997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-telemeter-client-tls\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.324679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-federate-client-tls\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.324730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-metrics-client-ca\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.324797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-serving-certs-ca-bundle\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.324876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-secret-telemeter-client\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.324980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbnkc\" (UniqueName: \"kubernetes.io/projected/509250ea-5b8f-46b2-9140-e92b0d75346e-kube-api-access-pbnkc\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.325021 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.325049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.325585 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.325078 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-telemeter-client-tls\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.326408 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.326373 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-metrics-client-ca\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.326558 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.326532 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-serving-certs-ca-bundle\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.326937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.326914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509250ea-5b8f-46b2-9140-e92b0d75346e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.328707 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.328660 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-federate-client-tls\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.328707 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.328698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-secret-telemeter-client\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.328824 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.328769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.328880 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.328834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/509250ea-5b8f-46b2-9140-e92b0d75346e-telemeter-client-tls\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.335577 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.335548 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbnkc\" (UniqueName: \"kubernetes.io/projected/509250ea-5b8f-46b2-9140-e92b0d75346e-kube-api-access-pbnkc\") pod \"telemeter-client-78bf64869c-bbggk\" (UID: \"509250ea-5b8f-46b2-9140-e92b0d75346e\") " pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.406558 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.406475 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk"
Apr 17 16:22:20.651717 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.651665 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" event={"ID":"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2","Type":"ContainerStarted","Data":"07913d9eba269cc116c1533a96f224ef1f1998b3279ca7c95c7f5ea9b807ecce"}
Apr 17 16:22:20.858021 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:20.857968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-78bf64869c-bbggk"]
Apr 17 16:22:20.860177 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:20.860144 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509250ea_5b8f_46b2_9140_e92b0d75346e.slice/crio-7a45008b12f66d40bdd0fbb3c939649716afb70974899bedfa84ef988406e5ca WatchSource:0}: Error finding container 7a45008b12f66d40bdd0fbb3c939649716afb70974899bedfa84ef988406e5ca: Status 404 returned error can't find the container with id 7a45008b12f66d40bdd0fbb3c939649716afb70974899bedfa84ef988406e5ca
Apr 17 16:22:21.659172 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:21.659127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"648d2b8d17c423e71dd5ad501e20daacab16be1dfb96cd95d1c7a74daa1a993c"}
Apr 17 16:22:21.659370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:21.659176 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"5385d95ded9428a9c500b33e46776cf58246efd374310da5f8e50277244b35a9"}
Apr 17 16:22:21.659370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:21.659194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"ca8fed4bfb820d789576825f4e64459ad89883890a5dbbeb4f2f1e1cbc078910"}
Apr 17 16:22:21.660664 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:21.660631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" event={"ID":"f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2","Type":"ContainerStarted","Data":"d56d42011b4bf34fba3173c1aa08d20d78c9d44ed9ca4b2674384ada8e5de004"}
Apr 17 16:22:21.661744 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:21.661722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk" event={"ID":"509250ea-5b8f-46b2-9140-e92b0d75346e","Type":"ContainerStarted","Data":"7a45008b12f66d40bdd0fbb3c939649716afb70974899bedfa84ef988406e5ca"}
Apr 17 16:22:21.680419 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:21.680367 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" podStartSLOduration=0.899864099
podStartE2EDuration="2.680352136s" podCreationTimestamp="2026-04-17 16:22:19 +0000 UTC" firstStartedPulling="2026-04-17 16:22:19.673928051 +0000 UTC m=+126.054556391" lastFinishedPulling="2026-04-17 16:22:21.454416076 +0000 UTC m=+127.835044428" observedRunningTime="2026-04-17 16:22:21.678869317 +0000 UTC m=+128.059497672" watchObservedRunningTime="2026-04-17 16:22:21.680352136 +0000 UTC m=+128.060980498" Apr 17 16:22:22.667181 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:22.667150 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"7f7a7f279be6b1f69680317922a1f47297cfea72fbca0b52e20559b5165eee22"} Apr 17 16:22:22.667181 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:22.667187 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"2ce460bed411d12311f71df358daa81f7d3d569995ee9481b11e50fc198e9f76"} Apr 17 16:22:22.667619 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:22.667196 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" event={"ID":"058c398d-5b48-483c-bc96-bd8e2f9f3bc3","Type":"ContainerStarted","Data":"82c542baf66742571b6962892a295c0a718e077eef79449873ddc129722ed28e"} Apr 17 16:22:22.693306 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:22.693253 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" podStartSLOduration=2.07994721 podStartE2EDuration="6.693221044s" podCreationTimestamp="2026-04-17 16:22:16 +0000 UTC" firstStartedPulling="2026-04-17 16:22:17.340390718 +0000 UTC m=+123.721019057" lastFinishedPulling="2026-04-17 16:22:21.953664552 +0000 UTC m=+128.334292891" observedRunningTime="2026-04-17 16:22:22.691548965 +0000 UTC m=+129.072177326" watchObservedRunningTime="2026-04-17 16:22:22.693221044 +0000 UTC m=+129.073849406" Apr 17 16:22:23.671755 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:23.671670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk" event={"ID":"509250ea-5b8f-46b2-9140-e92b0d75346e","Type":"ContainerStarted","Data":"5957119bc30a56e972cddd2cc25cd85a7b087ba25db95b609f99f8c8b323f42c"} Apr 17 16:22:23.672132 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:23.672048 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" Apr 17 16:22:23.961723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:23.961630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:22:23.964342 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:23.964314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b74a4398-a3fb-40e5-b014-d968d4c10069-metrics-certs\") pod \"network-metrics-daemon-tfgvs\" (UID: \"b74a4398-a3fb-40e5-b014-d968d4c10069\") " pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:22:24.228713 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:22:24.228632 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qr2mj\"" Apr 17 16:22:24.236574 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:24.236540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tfgvs" Apr 17 16:22:24.370398 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:24.370275 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tfgvs"] Apr 17 16:22:24.373061 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:24.373028 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74a4398_a3fb_40e5_b014_d968d4c10069.slice/crio-5b7ffdae4ce11e1983ee4263b3c6cfb2a91aa25818f2dc0a90c2dd93f3a4d421 WatchSource:0}: Error finding container 5b7ffdae4ce11e1983ee4263b3c6cfb2a91aa25818f2dc0a90c2dd93f3a4d421: Status 404 returned error can't find the container with id 5b7ffdae4ce11e1983ee4263b3c6cfb2a91aa25818f2dc0a90c2dd93f3a4d421 Apr 17 16:22:24.675058 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:24.675026 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tfgvs" event={"ID":"b74a4398-a3fb-40e5-b014-d968d4c10069","Type":"ContainerStarted","Data":"5b7ffdae4ce11e1983ee4263b3c6cfb2a91aa25818f2dc0a90c2dd93f3a4d421"} Apr 17 16:22:25.680684 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:25.680652 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk" event={"ID":"509250ea-5b8f-46b2-9140-e92b0d75346e","Type":"ContainerStarted","Data":"8dbbde243efac882165762ef4c4331ceb6e1074664897c4776644435554b5087"} Apr 17 16:22:25.680684 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:25.680689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk" event={"ID":"509250ea-5b8f-46b2-9140-e92b0d75346e","Type":"ContainerStarted","Data":"7ee6195deb2745a68b0417d331025eb19391f47077108f6b3129e271a3353ed9"} Apr 17 16:22:25.701337 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:25.701291 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-78bf64869c-bbggk" podStartSLOduration=1.549197553 podStartE2EDuration="5.701277776s" podCreationTimestamp="2026-04-17 16:22:20 +0000 UTC" firstStartedPulling="2026-04-17 16:22:20.864631438 +0000 UTC m=+127.245259786" lastFinishedPulling="2026-04-17 16:22:25.01671167 +0000 UTC m=+131.397340009" observedRunningTime="2026-04-17 16:22:25.699841614 +0000 UTC m=+132.080469990" watchObservedRunningTime="2026-04-17 16:22:25.701277776 +0000 UTC m=+132.081906136" Apr 17 16:22:26.685274 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:26.685220 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tfgvs" event={"ID":"b74a4398-a3fb-40e5-b014-d968d4c10069","Type":"ContainerStarted","Data":"84830afec6784be52f90e1a5292703bbecd3f0a8b126263b09190550d7010306"} Apr 17 16:22:26.685274 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:26.685279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tfgvs" event={"ID":"b74a4398-a3fb-40e5-b014-d968d4c10069","Type":"ContainerStarted","Data":"44355cddb5a96b5149aade05a0b41bf179eafc7df1299c914a14ed923739f20d"} Apr 17 16:22:26.700113 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:22:26.700055 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tfgvs" podStartSLOduration=131.063837862 podStartE2EDuration="2m12.700040345s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:22:24.375039733 +0000 UTC m=+130.755668072" lastFinishedPulling="2026-04-17 16:22:26.011242202 +0000 UTC m=+132.391870555" observedRunningTime="2026-04-17 16:22:26.698767204 +0000 UTC m=+133.079395567" watchObservedRunningTime="2026-04-17 16:22:26.700040345 +0000 UTC m=+133.080668755" Apr 17 16:22:29.681534 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:29.681504 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-64f4f5c6b8-6cdhc" Apr 17 16:22:31.038337 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.038289 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" podUID="c86881cb-f096-4917-ba77-b03ea33790c7" containerName="registry" containerID="cri-o://4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9" gracePeriod=30 Apr 17 16:22:31.273724 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.273694 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:22:31.319495 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319470 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5zvw\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-kube-api-access-t5zvw\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319521 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-bound-sa-token\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319551 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-image-registry-private-configuration\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319566 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319593 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86881cb-f096-4917-ba77-b03ea33790c7-ca-trust-extracted\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319627 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319619 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-registry-certificates\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319883 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319665 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-trusted-ca\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.319883 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.319715 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-installation-pull-secrets\") pod \"c86881cb-f096-4917-ba77-b03ea33790c7\" (UID: \"c86881cb-f096-4917-ba77-b03ea33790c7\") " Apr 17 16:22:31.320313 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.320114 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:22:31.320456 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.320389 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:22:31.321986 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.321956 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:22:31.322082 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.322035 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:22:31.322279 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.322250 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:22:31.322279 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.322266 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-kube-api-access-t5zvw" (OuterVolumeSpecName: "kube-api-access-t5zvw") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "kube-api-access-t5zvw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:22:31.322413 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.322255 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:22:31.328477 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.328451 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86881cb-f096-4917-ba77-b03ea33790c7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c86881cb-f096-4917-ba77-b03ea33790c7" (UID: "c86881cb-f096-4917-ba77-b03ea33790c7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:22:31.420925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.420768 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-registry-certificates\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.420925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.420818 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c86881cb-f096-4917-ba77-b03ea33790c7-trusted-ca\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.420925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.420835 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-installation-pull-secrets\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.420925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.420849 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5zvw\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-kube-api-access-t5zvw\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.420925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.420866 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-bound-sa-token\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.424595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.420887 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c86881cb-f096-4917-ba77-b03ea33790c7-image-registry-private-configuration\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.424595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.421372 2569 reconciler_common.go:299] "Volume detached for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c86881cb-f096-4917-ba77-b03ea33790c7-registry-tls\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.424595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.421394 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c86881cb-f096-4917-ba77-b03ea33790c7-ca-trust-extracted\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:22:31.701643 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.701558 2569 generic.go:358] "Generic (PLEG): container finished" podID="c86881cb-f096-4917-ba77-b03ea33790c7" containerID="4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9" exitCode=0 Apr 17 16:22:31.701643 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.701606 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" event={"ID":"c86881cb-f096-4917-ba77-b03ea33790c7","Type":"ContainerDied","Data":"4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9"} Apr 17 16:22:31.701643 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.701627 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" event={"ID":"c86881cb-f096-4917-ba77-b03ea33790c7","Type":"ContainerDied","Data":"460dc60e1cd5eba2f6756ef574aeb9a837e9f092dae0a6301425b9682545e8f2"} Apr 17 16:22:31.701643 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.701642 2569 scope.go:117] "RemoveContainer" containerID="4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9" Apr 17 16:22:31.701919 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.701647 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6988f59bd7-zttj5" Apr 17 16:22:31.710166 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.710147 2569 scope.go:117] "RemoveContainer" containerID="4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9" Apr 17 16:22:31.710463 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:22:31.710443 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9\": container with ID starting with 4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9 not found: ID does not exist" containerID="4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9" Apr 17 16:22:31.710504 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.710474 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9"} err="failed to get container status \"4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9\": rpc error: code = NotFound desc = could not find container \"4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9\": container with ID starting with 4893ba38af9bcb859510796eaeeb97432fae7791bbc36a883f53b943036234e9 not found: ID does not exist" Apr 17 16:22:31.722526 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.722502 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6988f59bd7-zttj5"] Apr 17 16:22:31.725613 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:31.725592 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6988f59bd7-zttj5"] Apr 17 16:22:32.215895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.215861 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86881cb-f096-4917-ba77-b03ea33790c7" path="/var/lib/kubelet/pods/c86881cb-f096-4917-ba77-b03ea33790c7/volumes" Apr 17 16:22:32.864621 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.864588 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-kkz7m"] Apr 17 16:22:32.864887 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.864876 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86881cb-f096-4917-ba77-b03ea33790c7" containerName="registry" Apr 17 16:22:32.864940 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.864889 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86881cb-f096-4917-ba77-b03ea33790c7" containerName="registry" Apr 17 16:22:32.864940 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.864935 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c86881cb-f096-4917-ba77-b03ea33790c7" containerName="registry" Apr 17 16:22:32.868446 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.868423 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:32.870865 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.870839 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5fzgz\"" Apr 17 16:22:32.871352 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.871101 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:22:32.871648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.871148 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:22:32.877981 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.877958 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kkz7m"] Apr 17 16:22:32.931768 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:32.931725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8rwf\" (UniqueName: \"kubernetes.io/projected/194f3893-97c5-431f-b1a4-de3bc7c0dcbe-kube-api-access-k8rwf\") pod \"downloads-6bcc868b7-kkz7m\" (UID: \"194f3893-97c5-431f-b1a4-de3bc7c0dcbe\") " pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:33.032994 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:33.032960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8rwf\" (UniqueName: \"kubernetes.io/projected/194f3893-97c5-431f-b1a4-de3bc7c0dcbe-kube-api-access-k8rwf\") pod \"downloads-6bcc868b7-kkz7m\" (UID: \"194f3893-97c5-431f-b1a4-de3bc7c0dcbe\") " pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:33.043391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:33.043365 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8rwf\" (UniqueName: \"kubernetes.io/projected/194f3893-97c5-431f-b1a4-de3bc7c0dcbe-kube-api-access-k8rwf\") pod \"downloads-6bcc868b7-kkz7m\" (UID: \"194f3893-97c5-431f-b1a4-de3bc7c0dcbe\") " pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:33.178923 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:33.178824 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:33.315871 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:33.315828 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kkz7m"] Apr 17 16:22:33.320090 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:33.320050 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194f3893_97c5_431f_b1a4_de3bc7c0dcbe.slice/crio-88e1f8bbc2c64e1491af470f57bcac30531886f3f9397adc3b087c9cf2c431fe WatchSource:0}: Error finding container 88e1f8bbc2c64e1491af470f57bcac30531886f3f9397adc3b087c9cf2c431fe: Status 404 returned error can't find the container with id 88e1f8bbc2c64e1491af470f57bcac30531886f3f9397adc3b087c9cf2c431fe Apr 17 16:22:33.708634 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:33.708600 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kkz7m" event={"ID":"194f3893-97c5-431f-b1a4-de3bc7c0dcbe","Type":"ContainerStarted","Data":"88e1f8bbc2c64e1491af470f57bcac30531886f3f9397adc3b087c9cf2c431fe"} Apr 17 16:22:39.538599 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:39.538568 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" Apr 17 16:22:39.538953 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:39.538643 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" Apr 17 16:22:41.904679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.904639 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-698b66b7d8-62bl7"] Apr 17 16:22:41.908536 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.908512 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:41.912391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.912310 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:22:41.912391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.912310 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-rc9lb\"" Apr 17 16:22:41.912391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.912312 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:22:41.912648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.912330 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:22:41.912648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.912330 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:22:41.912648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.912365 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:22:41.919200 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:41.919165 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-698b66b7d8-62bl7"] Apr 17 16:22:42.006825 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.006796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-oauth-config\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.007021 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.006862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28x4g\" (UniqueName: \"kubernetes.io/projected/7a720dc8-a468-4726-93c7-7ed242ef3b52-kube-api-access-28x4g\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.007021 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.006965 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-config\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.007021 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.007002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-serving-cert\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.007159 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.007031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-service-ca\") pod 
\"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.007159 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.007071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-oauth-serving-cert\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.108384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.108341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28x4g\" (UniqueName: \"kubernetes.io/projected/7a720dc8-a468-4726-93c7-7ed242ef3b52-kube-api-access-28x4g\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.108561 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.108416 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-config\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.108561 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.108451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-serving-cert\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.108561 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.108476 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-service-ca\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.108561 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.108503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-oauth-serving-cert\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.108561 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.108557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-oauth-config\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.109150 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.109130 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-service-ca\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.109274 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.109196 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-config\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.109349 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.109332 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-oauth-serving-cert\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.111374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.111345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-oauth-config\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.111599 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.111579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-serving-cert\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.116104 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.116074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28x4g\" (UniqueName: \"kubernetes.io/projected/7a720dc8-a468-4726-93c7-7ed242ef3b52-kube-api-access-28x4g\") pod \"console-698b66b7d8-62bl7\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.221295 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.221203 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:42.358571 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.358540 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-698b66b7d8-62bl7"] Apr 17 16:22:42.361303 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:42.361266 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a720dc8_a468_4726_93c7_7ed242ef3b52.slice/crio-e8fb91f6a21f4ba03e6596634660f7323a71065b804296506b9e65a39b70c87a WatchSource:0}: Error finding container e8fb91f6a21f4ba03e6596634660f7323a71065b804296506b9e65a39b70c87a: Status 404 returned error can't find the container with id e8fb91f6a21f4ba03e6596634660f7323a71065b804296506b9e65a39b70c87a Apr 17 16:22:42.740403 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:42.740359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-698b66b7d8-62bl7" event={"ID":"7a720dc8-a468-4726-93c7-7ed242ef3b52","Type":"ContainerStarted","Data":"e8fb91f6a21f4ba03e6596634660f7323a71065b804296506b9e65a39b70c87a"} Apr 17 16:22:45.751323 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:45.751286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-698b66b7d8-62bl7" event={"ID":"7a720dc8-a468-4726-93c7-7ed242ef3b52","Type":"ContainerStarted","Data":"4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2"} Apr 17 16:22:45.766685 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:45.766621 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-698b66b7d8-62bl7" podStartSLOduration=1.596015797 podStartE2EDuration="4.766605909s" podCreationTimestamp="2026-04-17 16:22:41 +0000 UTC" firstStartedPulling="2026-04-17 16:22:42.363676197 +0000 UTC m=+148.744304538" lastFinishedPulling="2026-04-17 16:22:45.534266308 +0000 UTC m=+151.914894650" observedRunningTime="2026-04-17 16:22:45.766122202 +0000 UTC m=+152.146750563" watchObservedRunningTime="2026-04-17 16:22:45.766605909 +0000 UTC m=+152.147234269" Apr 17 16:22:49.565730 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.565695 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-857568656c-d4bwx"] Apr 17 16:22:49.569175 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.569144 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.578505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.578477 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:22:49.584355 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.584329 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857568656c-d4bwx"] Apr 17 16:22:49.677365 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677335 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-service-ca\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.677570 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677470 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-oauth-serving-cert\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.677570 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-config\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.677570 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677551 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jksbk\" (UniqueName: \"kubernetes.io/projected/d04935fc-fe67-479e-bbee-3e2f78b7488c-kube-api-access-jksbk\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.677725 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-trusted-ca-bundle\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.677725 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677679 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-oauth-config\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.677725 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.677719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-serving-cert\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.778997 ip-10-0-136-214 
kubenswrapper[2569]: I0417 16:22:49.778963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-serving-cert\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-service-ca\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-oauth-serving-cert\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-config\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779162 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jksbk\" (UniqueName: \"kubernetes.io/projected/d04935fc-fe67-479e-bbee-3e2f78b7488c-kube-api-access-jksbk\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779463 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-trusted-ca-bundle\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779463 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-oauth-config\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.779964 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.779937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-service-ca\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.780285 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.780262 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-oauth-serving-cert\") pod 
\"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.780346 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.780271 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-config\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.780502 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.780480 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-trusted-ca-bundle\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.782009 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.781984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-serving-cert\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.782123 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.782049 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-oauth-config\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.787079 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.787053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jksbk\" (UniqueName: \"kubernetes.io/projected/d04935fc-fe67-479e-bbee-3e2f78b7488c-kube-api-access-jksbk\") pod \"console-857568656c-d4bwx\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") " pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:49.880701 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:49.880610 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:52.222204 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:52.222172 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:52.222672 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:52.222271 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:52.227464 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:52.227434 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:52.777886 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:52.777851 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:22:53.530625 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:53.530590 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857568656c-d4bwx"] Apr 17 16:22:53.625304 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:22:53.625270 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04935fc_fe67_479e_bbee_3e2f78b7488c.slice/crio-39222b775ce24aa46de764965e1bb3d75709e38a2b7413ee338c16a10073b578 WatchSource:0}: Error finding container 39222b775ce24aa46de764965e1bb3d75709e38a2b7413ee338c16a10073b578: Status 404 returned error can't find the container with id 39222b775ce24aa46de764965e1bb3d75709e38a2b7413ee338c16a10073b578 Apr 17 16:22:53.777620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:53.777578 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857568656c-d4bwx" event={"ID":"d04935fc-fe67-479e-bbee-3e2f78b7488c","Type":"ContainerStarted","Data":"39222b775ce24aa46de764965e1bb3d75709e38a2b7413ee338c16a10073b578"} Apr 17 16:22:53.779095 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:53.779069 2569 generic.go:358] "Generic (PLEG): container finished" podID="0ae9cb04-c427-4571-bac6-8e89f37be1c0" containerID="b9cc289ef714c754405b7e4a0bc05f5bb063f232852033e7311111c56ccb55a9" exitCode=0 Apr 17 16:22:53.779266 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:53.779158 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" event={"ID":"0ae9cb04-c427-4571-bac6-8e89f37be1c0","Type":"ContainerDied","Data":"b9cc289ef714c754405b7e4a0bc05f5bb063f232852033e7311111c56ccb55a9"} Apr 17 16:22:53.779626 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:53.779610 2569 scope.go:117] "RemoveContainer" containerID="b9cc289ef714c754405b7e4a0bc05f5bb063f232852033e7311111c56ccb55a9" Apr 17 16:22:54.784894 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.784804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kkz7m" event={"ID":"194f3893-97c5-431f-b1a4-de3bc7c0dcbe","Type":"ContainerStarted","Data":"b9e96ccd3cee5a4bc72519f98d97bab8685bee6cecad5908a8bd15e0ab09482c"} Apr 17 16:22:54.785366 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.785043 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:54.786701 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.786666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857568656c-d4bwx" 
event={"ID":"d04935fc-fe67-479e-bbee-3e2f78b7488c","Type":"ContainerStarted","Data":"fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988"} Apr 17 16:22:54.789076 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.789045 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-mm8ns" event={"ID":"0ae9cb04-c427-4571-bac6-8e89f37be1c0","Type":"ContainerStarted","Data":"01ec33e84984e7966877373d9c9d44727462e850fbbf8b8fed7ec2c7460573cc"} Apr 17 16:22:54.802716 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.802659 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-kkz7m" podStartSLOduration=2.432701941 podStartE2EDuration="22.802640043s" podCreationTimestamp="2026-04-17 16:22:32 +0000 UTC" firstStartedPulling="2026-04-17 16:22:33.324275695 +0000 UTC m=+139.704904049" lastFinishedPulling="2026-04-17 16:22:53.694213812 +0000 UTC m=+160.074842151" observedRunningTime="2026-04-17 16:22:54.800512573 +0000 UTC m=+161.181140995" watchObservedRunningTime="2026-04-17 16:22:54.802640043 +0000 UTC m=+161.183268404" Apr 17 16:22:54.804506 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.804477 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-kkz7m" Apr 17 16:22:54.816370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:54.816315 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-857568656c-d4bwx" podStartSLOduration=5.816298198 podStartE2EDuration="5.816298198s" podCreationTimestamp="2026-04-17 16:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:22:54.815094726 +0000 UTC m=+161.195723146" watchObservedRunningTime="2026-04-17 16:22:54.816298198 +0000 UTC m=+161.196926558" Apr 17 16:22:59.418980 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:59.418945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-h5d4m_dce4e627-2afb-4861-8b1a-4bf531c0f4a7/serve-healthcheck-canary/0.log" Apr 17 16:22:59.544455 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:59.544423 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" Apr 17 16:22:59.549064 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:59.549036 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5fdb8f4b6c-5f6rz" Apr 17 16:22:59.881361 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:59.881322 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:59.881361 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:59.881365 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:22:59.886862 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:22:59.886834 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:23:00.813689 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:00.813656 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-857568656c-d4bwx" Apr 17 16:23:00.857005 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:23:00.856967 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-698b66b7d8-62bl7"] Apr 17 16:23:25.880924 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:25.880849 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-698b66b7d8-62bl7" podUID="7a720dc8-a468-4726-93c7-7ed242ef3b52" containerName="console" containerID="cri-o://4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2" gracePeriod=15 Apr 17 16:23:26.132463 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.132406 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-698b66b7d8-62bl7_7a720dc8-a468-4726-93c7-7ed242ef3b52/console/0.log" Apr 17 16:23:26.132574 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.132467 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:23:26.213546 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.213518 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-oauth-config\") pod \"7a720dc8-a468-4726-93c7-7ed242ef3b52\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " Apr 17 16:23:26.213546 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.213551 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-serving-cert\") pod \"7a720dc8-a468-4726-93c7-7ed242ef3b52\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " Apr 17 16:23:26.213790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.213570 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-oauth-serving-cert\") pod \"7a720dc8-a468-4726-93c7-7ed242ef3b52\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " Apr 17 16:23:26.213790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.213595 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28x4g\" (UniqueName: \"kubernetes.io/projected/7a720dc8-a468-4726-93c7-7ed242ef3b52-kube-api-access-28x4g\") pod \"7a720dc8-a468-4726-93c7-7ed242ef3b52\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " Apr 17 16:23:26.213790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.213614 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-service-ca\") pod \"7a720dc8-a468-4726-93c7-7ed242ef3b52\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " Apr 17 16:23:26.213790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.213662 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-config\") pod \"7a720dc8-a468-4726-93c7-7ed242ef3b52\" (UID: \"7a720dc8-a468-4726-93c7-7ed242ef3b52\") " Apr 17 16:23:26.214151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.214092 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-service-ca" (OuterVolumeSpecName: "service-ca") pod "7a720dc8-a468-4726-93c7-7ed242ef3b52" (UID: 
"7a720dc8-a468-4726-93c7-7ed242ef3b52"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:26.214151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.214091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7a720dc8-a468-4726-93c7-7ed242ef3b52" (UID: "7a720dc8-a468-4726-93c7-7ed242ef3b52"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:26.214355 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.214195 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-config" (OuterVolumeSpecName: "console-config") pod "7a720dc8-a468-4726-93c7-7ed242ef3b52" (UID: "7a720dc8-a468-4726-93c7-7ed242ef3b52"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:26.215906 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.215879 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7a720dc8-a468-4726-93c7-7ed242ef3b52" (UID: "7a720dc8-a468-4726-93c7-7ed242ef3b52"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:26.216184 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.216166 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7a720dc8-a468-4726-93c7-7ed242ef3b52" (UID: "7a720dc8-a468-4726-93c7-7ed242ef3b52"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:26.216380 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.216360 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a720dc8-a468-4726-93c7-7ed242ef3b52-kube-api-access-28x4g" (OuterVolumeSpecName: "kube-api-access-28x4g") pod "7a720dc8-a468-4726-93c7-7ed242ef3b52" (UID: "7a720dc8-a468-4726-93c7-7ed242ef3b52"). InnerVolumeSpecName "kube-api-access-28x4g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:23:26.314494 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.314458 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-oauth-config\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:23:26.314494 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.314487 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-serving-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:23:26.314494 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.314496 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-oauth-serving-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:23:26.314494 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.314506 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28x4g\" (UniqueName: \"kubernetes.io/projected/7a720dc8-a468-4726-93c7-7ed242ef3b52-kube-api-access-28x4g\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:23:26.314758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.314517 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-service-ca\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:23:26.314758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.314526 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7a720dc8-a468-4726-93c7-7ed242ef3b52-console-config\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:23:26.888119 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.888092 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-698b66b7d8-62bl7_7a720dc8-a468-4726-93c7-7ed242ef3b52/console/0.log" Apr 17 16:23:26.888565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.888142 2569 generic.go:358] "Generic (PLEG): container finished" podID="7a720dc8-a468-4726-93c7-7ed242ef3b52" containerID="4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2" exitCode=2 Apr 17 16:23:26.888565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.888179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-698b66b7d8-62bl7" event={"ID":"7a720dc8-a468-4726-93c7-7ed242ef3b52","Type":"ContainerDied","Data":"4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2"} Apr 17 16:23:26.888565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.888217 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-698b66b7d8-62bl7" Apr 17 16:23:26.888565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.888245 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-698b66b7d8-62bl7" event={"ID":"7a720dc8-a468-4726-93c7-7ed242ef3b52","Type":"ContainerDied","Data":"e8fb91f6a21f4ba03e6596634660f7323a71065b804296506b9e65a39b70c87a"} Apr 17 16:23:26.888565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.888257 2569 scope.go:117] "RemoveContainer" containerID="4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2" Apr 17 16:23:26.896306 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.896289 2569 scope.go:117] "RemoveContainer" containerID="4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2" Apr 17 16:23:26.896568 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:23:26.896548 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2\": container with ID starting with 4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2 not found: ID does not exist" containerID="4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2" Apr 17 16:23:26.896618 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.896577 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2"} err="failed to get container status \"4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2\": rpc error: code = NotFound desc = could not find container \"4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2\": container with ID starting with 4d317082da6c6e53dd43a28957db1bd2fd4bd9b32d64bf4c603575bc7a7bf2b2 not found: ID does not exist" Apr 17 16:23:26.903149 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.903124 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-698b66b7d8-62bl7"] Apr 17 16:23:26.906633 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:26.906611 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-698b66b7d8-62bl7"] Apr 17 16:23:28.216380 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:28.216352 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a720dc8-a468-4726-93c7-7ed242ef3b52" path="/var/lib/kubelet/pods/7a720dc8-a468-4726-93c7-7ed242ef3b52/volumes" Apr 17 16:23:32.872632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.872597 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f87b67c4b-29cns"] Apr 17 16:23:32.873008 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.872894 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a720dc8-a468-4726-93c7-7ed242ef3b52" containerName="console" Apr 17 16:23:32.873008 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.872905 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a720dc8-a468-4726-93c7-7ed242ef3b52" containerName="console" Apr 17 16:23:32.873008 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.872953 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a720dc8-a468-4726-93c7-7ed242ef3b52" containerName="console" Apr 17 16:23:32.910074 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.910049 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-5f87b67c4b-29cns"] Apr 17 16:23:32.910249 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.910152 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977591 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-service-ca\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977591 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977593 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-trusted-ca-bundle\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977778 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-console-config\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977778 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-oauth-serving-cert\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977778 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-oauth-config\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977778 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977748 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhz4v\" (UniqueName: \"kubernetes.io/projected/2cb33870-85ad-4318-b170-0110affd63f8-kube-api-access-fhz4v\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:32.977920 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:32.977767 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-serving-cert\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.078835 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.078796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-oauth-config\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.078844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhz4v\" (UniqueName: \"kubernetes.io/projected/2cb33870-85ad-4318-b170-0110affd63f8-kube-api-access-fhz4v\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.078883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-serving-cert\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.078946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-service-ca\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.078980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-trusted-ca-bundle\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.079020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-console-config\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079321 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.079049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-oauth-serving-cert\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079691 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.079666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-service-ca\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079788 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.079706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-console-config\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.079788 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:23:33.079743 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-oauth-serving-cert\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.080043 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.080019 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-trusted-ca-bundle\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.081459 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.081443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-oauth-config\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.081641 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.081625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-serving-cert\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.085853 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.085829 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhz4v\" (UniqueName: \"kubernetes.io/projected/2cb33870-85ad-4318-b170-0110affd63f8-kube-api-access-fhz4v\") pod \"console-5f87b67c4b-29cns\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") " pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.219437 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.219355 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:33.338687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.338659 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f87b67c4b-29cns"] Apr 17 16:23:33.341514 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:23:33.341486 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cb33870_85ad_4318_b170_0110affd63f8.slice/crio-36318d0f607eac48156bf8c2a43d08cf4cbeb4ae759a084f73748064296b858f WatchSource:0}: Error finding container 36318d0f607eac48156bf8c2a43d08cf4cbeb4ae759a084f73748064296b858f: Status 404 returned error can't find the container with id 36318d0f607eac48156bf8c2a43d08cf4cbeb4ae759a084f73748064296b858f Apr 17 16:23:33.910170 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.910138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87b67c4b-29cns" event={"ID":"2cb33870-85ad-4318-b170-0110affd63f8","Type":"ContainerStarted","Data":"1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa"} Apr 17 16:23:33.910170 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.910175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87b67c4b-29cns" event={"ID":"2cb33870-85ad-4318-b170-0110affd63f8","Type":"ContainerStarted","Data":"36318d0f607eac48156bf8c2a43d08cf4cbeb4ae759a084f73748064296b858f"} Apr 17 16:23:33.934800 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:33.934747 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f87b67c4b-29cns" podStartSLOduration=1.9347293890000001 podStartE2EDuration="1.934729389s" podCreationTimestamp="2026-04-17 16:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:23:33.933289805 +0000 UTC m=+200.313918166" watchObservedRunningTime="2026-04-17 16:23:33.934729389 +0000 UTC m=+200.315357752" Apr 17 16:23:43.220299 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:43.220261 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:43.220299 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:43.220309 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:43.224832 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:43.224807 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:43.942820 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:43.942794 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f87b67c4b-29cns" Apr 17 16:23:43.996742 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:23:43.996709 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857568656c-d4bwx"] Apr 17 16:24:09.019243 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.019163 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-857568656c-d4bwx" podUID="d04935fc-fe67-479e-bbee-3e2f78b7488c" containerName="console" containerID="cri-o://fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988" gracePeriod=15 Apr 17 16:24:09.257666 ip-10-0-136-214 kubenswrapper[2569]: I0417 
Apr 17 16:24:09.257787 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.257702 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857568656c-d4bwx"
Apr 17 16:24:09.366523 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366491 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-config\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.366691 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366532 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-oauth-serving-cert\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.366691 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366591 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-oauth-config\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.366691 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366615 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-service-ca\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.366691 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366637 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-serving-cert\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.366874 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366832 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-trusted-ca-bundle\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.366928 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366892 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jksbk\" (UniqueName: \"kubernetes.io/projected/d04935fc-fe67-479e-bbee-3e2f78b7488c-kube-api-access-jksbk\") pod \"d04935fc-fe67-479e-bbee-3e2f78b7488c\" (UID: \"d04935fc-fe67-479e-bbee-3e2f78b7488c\") "
Apr 17 16:24:09.367076 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.366990 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-config" (OuterVolumeSpecName: "console-config") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:24:09.367158 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.367086 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:24:09.367158 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.367098 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-service-ca" (OuterVolumeSpecName: "service-ca") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:24:09.367295 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.367170 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-config\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:09.367295 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.367187 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:24:09.368903 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.368878 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:24:09.368903 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.368893 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04935fc-fe67-479e-bbee-3e2f78b7488c-kube-api-access-jksbk" (OuterVolumeSpecName: "kube-api-access-jksbk") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "kube-api-access-jksbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:24:09.369026 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.368924 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d04935fc-fe67-479e-bbee-3e2f78b7488c" (UID: "d04935fc-fe67-479e-bbee-3e2f78b7488c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:24:09.467649 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.467602 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-trusted-ca-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:09.467649 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.467639 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jksbk\" (UniqueName: \"kubernetes.io/projected/d04935fc-fe67-479e-bbee-3e2f78b7488c-kube-api-access-jksbk\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:09.467649 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.467650 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-oauth-serving-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:09.467649 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.467660 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-oauth-config\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:09.467909 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.467670 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d04935fc-fe67-479e-bbee-3e2f78b7488c-service-ca\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:09.467909 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:09.467678 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d04935fc-fe67-479e-bbee-3e2f78b7488c-console-serving-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:24:10.013429 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.013401 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857568656c-d4bwx_d04935fc-fe67-479e-bbee-3e2f78b7488c/console/0.log"
Apr 17 16:24:10.013601 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.013442 2569 generic.go:358] "Generic (PLEG): container finished" podID="d04935fc-fe67-479e-bbee-3e2f78b7488c" containerID="fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988" exitCode=2
Apr 17 16:24:10.013601 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.013513 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857568656c-d4bwx"
Apr 17 16:24:10.013601 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.013531 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857568656c-d4bwx" event={"ID":"d04935fc-fe67-479e-bbee-3e2f78b7488c","Type":"ContainerDied","Data":"fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988"}
Apr 17 16:24:10.013601 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.013568 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857568656c-d4bwx" event={"ID":"d04935fc-fe67-479e-bbee-3e2f78b7488c","Type":"ContainerDied","Data":"39222b775ce24aa46de764965e1bb3d75709e38a2b7413ee338c16a10073b578"}
Apr 17 16:24:10.013601 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.013588 2569 scope.go:117] "RemoveContainer" containerID="fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988"
Apr 17 16:24:10.022513 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.022374 2569 scope.go:117] "RemoveContainer" containerID="fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988"
Apr 17 16:24:10.022731 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:24:10.022668 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988\": container with ID starting with fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988 not found: ID does not exist" containerID="fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988"
Apr 17 16:24:10.022731 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.022691 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988"} err="failed to get container status \"fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988\": rpc error: code = NotFound desc = could not find container \"fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988\": container with ID starting with fe7c7624d8bae06758c9d3c1c004ed2fc2ffad88ddaed2be6466762beaa28988 not found: ID does not exist"
Apr 17 16:24:10.034809 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.034779 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857568656c-d4bwx"]
Apr 17 16:24:10.040391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.040361 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-857568656c-d4bwx"]
Apr 17 16:24:10.216304 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:10.216270 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04935fc-fe67-479e-bbee-3e2f78b7488c" path="/var/lib/kubelet/pods/d04935fc-fe67-479e-bbee-3e2f78b7488c/volumes"
Apr 17 16:24:15.993664 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:15.993627 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-b6drk"]
Apr 17 16:24:15.994015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:15.993920 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d04935fc-fe67-479e-bbee-3e2f78b7488c" containerName="console"
Apr 17 16:24:15.994015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:15.993930 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04935fc-fe67-479e-bbee-3e2f78b7488c" containerName="console"
Apr 17 16:24:15.994015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:15.993990 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d04935fc-fe67-479e-bbee-3e2f78b7488c" containerName="console"
Apr 17 16:24:15.998343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:15.998325 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.000650 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.000626 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:24:16.003046 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.003022 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b6drk"]
Apr 17 16:24:16.015801 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.015776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/338e932a-1f5c-4a6b-8be1-288700fd3608-kubelet-config\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.015909 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.015807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/338e932a-1f5c-4a6b-8be1-288700fd3608-original-pull-secret\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.015909 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.015852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/338e932a-1f5c-4a6b-8be1-288700fd3608-dbus\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.117078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.117037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/338e932a-1f5c-4a6b-8be1-288700fd3608-dbus\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.117304 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.117104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/338e932a-1f5c-4a6b-8be1-288700fd3608-kubelet-config\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.117304 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.117121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/338e932a-1f5c-4a6b-8be1-288700fd3608-original-pull-secret\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
Apr 17 16:24:16.117304 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.117221 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/338e932a-1f5c-4a6b-8be1-288700fd3608-dbus\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk"
pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk" Apr 17 16:24:16.117304 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.117268 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/338e932a-1f5c-4a6b-8be1-288700fd3608-kubelet-config\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk" Apr 17 16:24:16.119378 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.119359 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/338e932a-1f5c-4a6b-8be1-288700fd3608-original-pull-secret\") pod \"global-pull-secret-syncer-b6drk\" (UID: \"338e932a-1f5c-4a6b-8be1-288700fd3608\") " pod="kube-system/global-pull-secret-syncer-b6drk" Apr 17 16:24:16.308198 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.308093 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-b6drk" Apr 17 16:24:16.424064 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:16.424027 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-b6drk"] Apr 17 16:24:16.426671 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:24:16.426646 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod338e932a_1f5c_4a6b_8be1_288700fd3608.slice/crio-beebf84d6b79b35e54eda52a2412601da7317528b8cdc168410a783116ed05b3 WatchSource:0}: Error finding container beebf84d6b79b35e54eda52a2412601da7317528b8cdc168410a783116ed05b3: Status 404 returned error can't find the container with id beebf84d6b79b35e54eda52a2412601da7317528b8cdc168410a783116ed05b3 Apr 17 16:24:17.034647 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:17.034598 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b6drk" event={"ID":"338e932a-1f5c-4a6b-8be1-288700fd3608","Type":"ContainerStarted","Data":"beebf84d6b79b35e54eda52a2412601da7317528b8cdc168410a783116ed05b3"} Apr 17 16:24:23.052519 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:23.052479 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-b6drk" event={"ID":"338e932a-1f5c-4a6b-8be1-288700fd3608","Type":"ContainerStarted","Data":"53fa3c4f839f43a367b573cd7c7567fd06bd17fa52d7f7fb70598d7b09b02907"} Apr 17 16:24:23.067238 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:23.067178 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-b6drk" podStartSLOduration=2.525755747 podStartE2EDuration="8.067162861s" podCreationTimestamp="2026-04-17 16:24:15 +0000 UTC" firstStartedPulling="2026-04-17 16:24:16.42820741 +0000 UTC m=+242.808835749" lastFinishedPulling="2026-04-17 16:24:21.969614521 +0000 UTC m=+248.350242863" observedRunningTime="2026-04-17 16:24:23.066540633 +0000 UTC m=+249.447168994" watchObservedRunningTime="2026-04-17 16:24:23.067162861 +0000 UTC m=+249.447791222" Apr 17 16:24:42.205427 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.205346 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5"] Apr 17 16:24:42.209488 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.209472 
2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.211965 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.211945 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:24:42.212880 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.212863 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:24:42.212939 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.212886 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wxrs2\"" Apr 17 16:24:42.217658 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.217630 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5"] Apr 17 16:24:42.333188 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.333145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.333188 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.333190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.333443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.333306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22vjj\" (UniqueName: \"kubernetes.io/projected/2b26fed4-bcb0-4798-9765-95295a62b1a6-kube-api-access-22vjj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.433975 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.433943 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.433975 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.433981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.434281 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:24:42.434016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22vjj\" (UniqueName: \"kubernetes.io/projected/2b26fed4-bcb0-4798-9765-95295a62b1a6-kube-api-access-22vjj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.434433 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.434413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.434518 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.434457 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.441571 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.441552 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22vjj\" (UniqueName: \"kubernetes.io/projected/2b26fed4-bcb0-4798-9765-95295a62b1a6-kube-api-access-22vjj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.519286 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.519164 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:24:42.636771 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:42.636745 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5"] Apr 17 16:24:42.639166 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:24:42.639142 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b26fed4_bcb0_4798_9765_95295a62b1a6.slice/crio-e1a22a39f8fdfd55ec167fd9662eb9e1276e8f2de79ee979c461701312e8542f WatchSource:0}: Error finding container e1a22a39f8fdfd55ec167fd9662eb9e1276e8f2de79ee979c461701312e8542f: Status 404 returned error can't find the container with id e1a22a39f8fdfd55ec167fd9662eb9e1276e8f2de79ee979c461701312e8542f Apr 17 16:24:43.107530 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:43.107497 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" event={"ID":"2b26fed4-bcb0-4798-9765-95295a62b1a6","Type":"ContainerStarted","Data":"e1a22a39f8fdfd55ec167fd9662eb9e1276e8f2de79ee979c461701312e8542f"} Apr 17 16:24:51.132071 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:51.132034 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerID="9620eb428deb9eaac63823c95b48972d773ba0da77758660d131d8af767f91fe" exitCode=0 Apr 17 16:24:51.132486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:51.132098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" event={"ID":"2b26fed4-bcb0-4798-9765-95295a62b1a6","Type":"ContainerDied","Data":"9620eb428deb9eaac63823c95b48972d773ba0da77758660d131d8af767f91fe"} Apr 17 16:24:53.140096 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:53.140051 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" event={"ID":"2b26fed4-bcb0-4798-9765-95295a62b1a6","Type":"ContainerStarted","Data":"d494742183f377ac1ad1e99720f12022549affd20d087868a723bf14dc3d7976"} Apr 17 16:24:54.144343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:54.144309 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerID="d494742183f377ac1ad1e99720f12022549affd20d087868a723bf14dc3d7976" exitCode=0 Apr 17 16:24:54.144721 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:24:54.144354 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" event={"ID":"2b26fed4-bcb0-4798-9765-95295a62b1a6","Type":"ContainerDied","Data":"d494742183f377ac1ad1e99720f12022549affd20d087868a723bf14dc3d7976"} Apr 17 16:25:03.177903 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:03.177869 2569 generic.go:358] "Generic (PLEG): container finished" podID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerID="2dba1bf944aebd0a8141744b086f2fd08ed65291dc7163948a3541c9527d4dbc" exitCode=0 Apr 17 16:25:03.178292 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:03.177961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" 
event={"ID":"2b26fed4-bcb0-4798-9765-95295a62b1a6","Type":"ContainerDied","Data":"2dba1bf944aebd0a8141744b086f2fd08ed65291dc7163948a3541c9527d4dbc"} Apr 17 16:25:04.304539 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.304510 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:25:04.318271 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.318247 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-bundle\") pod \"2b26fed4-bcb0-4798-9765-95295a62b1a6\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " Apr 17 16:25:04.318518 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.318316 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22vjj\" (UniqueName: \"kubernetes.io/projected/2b26fed4-bcb0-4798-9765-95295a62b1a6-kube-api-access-22vjj\") pod \"2b26fed4-bcb0-4798-9765-95295a62b1a6\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " Apr 17 16:25:04.318518 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.318359 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-util\") pod \"2b26fed4-bcb0-4798-9765-95295a62b1a6\" (UID: \"2b26fed4-bcb0-4798-9765-95295a62b1a6\") " Apr 17 16:25:04.319000 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.318959 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-bundle" (OuterVolumeSpecName: "bundle") pod "2b26fed4-bcb0-4798-9765-95295a62b1a6" (UID: "2b26fed4-bcb0-4798-9765-95295a62b1a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:25:04.320843 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.320806 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b26fed4-bcb0-4798-9765-95295a62b1a6-kube-api-access-22vjj" (OuterVolumeSpecName: "kube-api-access-22vjj") pod "2b26fed4-bcb0-4798-9765-95295a62b1a6" (UID: "2b26fed4-bcb0-4798-9765-95295a62b1a6"). InnerVolumeSpecName "kube-api-access-22vjj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:25:04.324658 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.324628 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-util" (OuterVolumeSpecName: "util") pod "2b26fed4-bcb0-4798-9765-95295a62b1a6" (UID: "2b26fed4-bcb0-4798-9765-95295a62b1a6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:25:04.419443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.419404 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:25:04.419443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.419439 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-22vjj\" (UniqueName: \"kubernetes.io/projected/2b26fed4-bcb0-4798-9765-95295a62b1a6-kube-api-access-22vjj\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:25:04.419443 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:04.419451 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b26fed4-bcb0-4798-9765-95295a62b1a6-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:25:05.185488 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:05.185456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" event={"ID":"2b26fed4-bcb0-4798-9765-95295a62b1a6","Type":"ContainerDied","Data":"e1a22a39f8fdfd55ec167fd9662eb9e1276e8f2de79ee979c461701312e8542f"} Apr 17 16:25:05.185488 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:05.185491 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1a22a39f8fdfd55ec167fd9662eb9e1276e8f2de79ee979c461701312e8542f" Apr 17 16:25:05.185684 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:05.185514 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e57lqf5" Apr 17 16:25:09.492202 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492149 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46"] Apr 17 16:25:09.492772 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492683 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerName="pull" Apr 17 16:25:09.492772 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492704 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerName="pull" Apr 17 16:25:09.492772 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492740 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerName="util" Apr 17 16:25:09.492772 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492749 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerName="util" Apr 17 16:25:09.492772 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492759 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerName="extract" Apr 17 16:25:09.492772 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492766 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" containerName="extract" Apr 17 16:25:09.493058 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.492838 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b26fed4-bcb0-4798-9765-95295a62b1a6" 
containerName="extract" Apr 17 16:25:09.499040 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.499019 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" Apr 17 16:25:09.501795 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.501772 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:25:09.502675 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.502647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 16:25:09.502864 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.502682 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-ww2sp\"" Apr 17 16:25:09.503875 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.503854 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46"] Apr 17 16:25:09.560672 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.560637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/769f61ce-5dab-4c61-a991-944d72915796-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tts46\" (UID: \"769f61ce-5dab-4c61-a991-944d72915796\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" Apr 17 16:25:09.560828 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.560722 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxb2m\" (UniqueName: \"kubernetes.io/projected/769f61ce-5dab-4c61-a991-944d72915796-kube-api-access-wxb2m\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tts46\" (UID: \"769f61ce-5dab-4c61-a991-944d72915796\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" Apr 17 16:25:09.661582 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.661548 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/769f61ce-5dab-4c61-a991-944d72915796-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tts46\" (UID: \"769f61ce-5dab-4c61-a991-944d72915796\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" Apr 17 16:25:09.661744 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.661614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxb2m\" (UniqueName: \"kubernetes.io/projected/769f61ce-5dab-4c61-a991-944d72915796-kube-api-access-wxb2m\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tts46\" (UID: \"769f61ce-5dab-4c61-a991-944d72915796\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" Apr 17 16:25:09.661938 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.661918 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/769f61ce-5dab-4c61-a991-944d72915796-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tts46\" (UID: \"769f61ce-5dab-4c61-a991-944d72915796\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" 
Apr 17 16:25:09.669061 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.669031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxb2m\" (UniqueName: \"kubernetes.io/projected/769f61ce-5dab-4c61-a991-944d72915796-kube-api-access-wxb2m\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-tts46\" (UID: \"769f61ce-5dab-4c61-a991-944d72915796\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46"
Apr 17 16:25:09.809447 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.809344 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46"
Apr 17 16:25:09.934728 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:09.934705 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46"]
Apr 17 16:25:09.937110 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:09.937078 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod769f61ce_5dab_4c61_a991_944d72915796.slice/crio-ed086583855e5659b7030c165c3d3af7c85d5e9bb625b7fc5066c31810168e1b WatchSource:0}: Error finding container ed086583855e5659b7030c165c3d3af7c85d5e9bb625b7fc5066c31810168e1b: Status 404 returned error can't find the container with id ed086583855e5659b7030c165c3d3af7c85d5e9bb625b7fc5066c31810168e1b
Apr 17 16:25:10.199793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:10.199752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" event={"ID":"769f61ce-5dab-4c61-a991-944d72915796","Type":"ContainerStarted","Data":"ed086583855e5659b7030c165c3d3af7c85d5e9bb625b7fc5066c31810168e1b"}
Apr 17 16:25:12.209749 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:12.209666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" event={"ID":"769f61ce-5dab-4c61-a991-944d72915796","Type":"ContainerStarted","Data":"70fa16ecf2006b3860b8794ad05a2cffd803c6e7d45675c7e920499937e6f797"}
Apr 17 16:25:12.228670 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:12.228612 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-tts46" podStartSLOduration=1.332026524 podStartE2EDuration="3.228598612s" podCreationTimestamp="2026-04-17 16:25:09 +0000 UTC" firstStartedPulling="2026-04-17 16:25:09.939428692 +0000 UTC m=+296.320057032" lastFinishedPulling="2026-04-17 16:25:11.836000777 +0000 UTC m=+298.216629120" observedRunningTime="2026-04-17 16:25:12.226841889 +0000 UTC m=+298.607470241" watchObservedRunningTime="2026-04-17 16:25:12.228598612 +0000 UTC m=+298.609227004"
Apr 17 16:25:13.396376 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.396339 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"]
Apr 17 16:25:13.399884 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.399867 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.402284 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.402259 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 16:25:13.403314 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.403299 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wxrs2\""
Apr 17 16:25:13.403374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.403321 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 16:25:13.407592 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.407572 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"]
Apr 17 16:25:13.497397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.497359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.497562 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.497404 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trg7k\" (UniqueName: \"kubernetes.io/projected/85ee360f-e36d-4161-bd03-ba315296548e-kube-api-access-trg7k\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.497562 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.497490 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.598600 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.598555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.598859 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.598834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trg7k\" (UniqueName: \"kubernetes.io/projected/85ee360f-e36d-4161-bd03-ba315296548e-kube-api-access-trg7k\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.599041 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.599015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.599162 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.599021 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.599478 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.599454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.608197 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.608157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trg7k\" (UniqueName: \"kubernetes.io/projected/85ee360f-e36d-4161-bd03-ba315296548e-kube-api-access-trg7k\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.710362 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.710280 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:13.834100 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:13.834075 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"]
Apr 17 16:25:13.837366 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:13.837334 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85ee360f_e36d_4161_bd03_ba315296548e.slice/crio-6db50dc0b9a4ae3917c3c61df5735ec78a4bb1c515b95587e983f16de1ad8588 WatchSource:0}: Error finding container 6db50dc0b9a4ae3917c3c61df5735ec78a4bb1c515b95587e983f16de1ad8588: Status 404 returned error can't find the container with id 6db50dc0b9a4ae3917c3c61df5735ec78a4bb1c515b95587e983f16de1ad8588
Apr 17 16:25:14.102737 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:14.102619 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 16:25:14.216845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:14.216696 2569 generic.go:358] "Generic (PLEG): container finished" podID="85ee360f-e36d-4161-bd03-ba315296548e" containerID="8fdf51cfeb646801cfecb50f9a98e7b2d83d0fbe66ae82e5b4657ba415eb280b" exitCode=0
Apr 17 16:25:14.216845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:14.216764 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm" event={"ID":"85ee360f-e36d-4161-bd03-ba315296548e","Type":"ContainerDied","Data":"8fdf51cfeb646801cfecb50f9a98e7b2d83d0fbe66ae82e5b4657ba415eb280b"}
Apr 17 16:25:14.216845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:14.216797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm" event={"ID":"85ee360f-e36d-4161-bd03-ba315296548e","Type":"ContainerStarted","Data":"6db50dc0b9a4ae3917c3c61df5735ec78a4bb1c515b95587e983f16de1ad8588"}
Apr 17 16:25:16.223940 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:16.223909 2569 generic.go:358] "Generic (PLEG): container finished" podID="85ee360f-e36d-4161-bd03-ba315296548e" containerID="2adbc25fd9b84e44c4941b98b8bc87aa6cc280a89dfd68df4d9c6da2e3d10a11" exitCode=0
Apr 17 16:25:16.224317 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:16.223957 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm" event={"ID":"85ee360f-e36d-4161-bd03-ba315296548e","Type":"ContainerDied","Data":"2adbc25fd9b84e44c4941b98b8bc87aa6cc280a89dfd68df4d9c6da2e3d10a11"}
Apr 17 16:25:16.224948 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:16.224932 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:25:17.229240 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:17.229188 2569 generic.go:358] "Generic (PLEG): container finished" podID="85ee360f-e36d-4161-bd03-ba315296548e" containerID="c97687bff09f667797540c4e9fe020863e5c31ac7dc19449c302dee99611e9ce" exitCode=0
Apr 17 16:25:17.229659 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:17.229278 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm" event={"ID":"85ee360f-e36d-4161-bd03-ba315296548e","Type":"ContainerDied","Data":"c97687bff09f667797540c4e9fe020863e5c31ac7dc19449c302dee99611e9ce"}
Apr 17 16:25:18.240017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.239983 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-cxkwl"]
Apr 17 16:25:18.243949 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.243930 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.246569 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.246548 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 16:25:18.246748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.246553 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 16:25:18.247441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.247406 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bf2h8\""
Apr 17 16:25:18.250074 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.250050 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-cxkwl"]
Apr 17 16:25:18.335353 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.335320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e88ee7a6-59a2-4a87-81d8-deabc0ad642e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-cxkwl\" (UID: \"e88ee7a6-59a2-4a87-81d8-deabc0ad642e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.335535 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.335438 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ncz\" (UniqueName: \"kubernetes.io/projected/e88ee7a6-59a2-4a87-81d8-deabc0ad642e-kube-api-access-s8ncz\") pod \"cert-manager-webhook-597b96b99b-cxkwl\" (UID: \"e88ee7a6-59a2-4a87-81d8-deabc0ad642e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.361327 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.361303 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:18.436030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.435995 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trg7k\" (UniqueName: \"kubernetes.io/projected/85ee360f-e36d-4161-bd03-ba315296548e-kube-api-access-trg7k\") pod \"85ee360f-e36d-4161-bd03-ba315296548e\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") "
Apr 17 16:25:18.436212 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.436046 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-util\") pod \"85ee360f-e36d-4161-bd03-ba315296548e\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") "
Apr 17 16:25:18.436212 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.436079 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-bundle\") pod \"85ee360f-e36d-4161-bd03-ba315296548e\" (UID: \"85ee360f-e36d-4161-bd03-ba315296548e\") "
Apr 17 16:25:18.436313 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.436270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ncz\" (UniqueName: \"kubernetes.io/projected/e88ee7a6-59a2-4a87-81d8-deabc0ad642e-kube-api-access-s8ncz\") pod \"cert-manager-webhook-597b96b99b-cxkwl\" (UID: \"e88ee7a6-59a2-4a87-81d8-deabc0ad642e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.436358 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.436329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e88ee7a6-59a2-4a87-81d8-deabc0ad642e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-cxkwl\" (UID: \"e88ee7a6-59a2-4a87-81d8-deabc0ad642e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.436614 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.436579 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-bundle" (OuterVolumeSpecName: "bundle") pod "85ee360f-e36d-4161-bd03-ba315296548e" (UID: "85ee360f-e36d-4161-bd03-ba315296548e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:25:18.438213 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.438187 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ee360f-e36d-4161-bd03-ba315296548e-kube-api-access-trg7k" (OuterVolumeSpecName: "kube-api-access-trg7k") pod "85ee360f-e36d-4161-bd03-ba315296548e" (UID: "85ee360f-e36d-4161-bd03-ba315296548e"). InnerVolumeSpecName "kube-api-access-trg7k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:25:18.441143 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.441108 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-util" (OuterVolumeSpecName: "util") pod "85ee360f-e36d-4161-bd03-ba315296548e" (UID: "85ee360f-e36d-4161-bd03-ba315296548e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:25:18.443680 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.443652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e88ee7a6-59a2-4a87-81d8-deabc0ad642e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-cxkwl\" (UID: \"e88ee7a6-59a2-4a87-81d8-deabc0ad642e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.443881 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.443862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ncz\" (UniqueName: \"kubernetes.io/projected/e88ee7a6-59a2-4a87-81d8-deabc0ad642e-kube-api-access-s8ncz\") pod \"cert-manager-webhook-597b96b99b-cxkwl\" (UID: \"e88ee7a6-59a2-4a87-81d8-deabc0ad642e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.536932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.536839 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:25:18.536932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.536874 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-trg7k\" (UniqueName: \"kubernetes.io/projected/85ee360f-e36d-4161-bd03-ba315296548e-kube-api-access-trg7k\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:25:18.536932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.536888 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85ee360f-e36d-4161-bd03-ba315296548e-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:25:18.568994 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.568950 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:18.690434 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:18.690404 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-cxkwl"]
Apr 17 16:25:18.692552 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:18.692522 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode88ee7a6_59a2_4a87_81d8_deabc0ad642e.slice/crio-ab268da69bf8cf9d73ecd52725ca2d335015f9961eddda15ac61af1e668eed9e WatchSource:0}: Error finding container ab268da69bf8cf9d73ecd52725ca2d335015f9961eddda15ac61af1e668eed9e: Status 404 returned error can't find the container with id ab268da69bf8cf9d73ecd52725ca2d335015f9961eddda15ac61af1e668eed9e
Apr 17 16:25:19.184644 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184609 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xcd97"]
Apr 17 16:25:19.184929 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184918 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="util"
Apr 17 16:25:19.184972 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184931 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="util"
Apr 17 16:25:19.184972 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184943 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="pull"
Apr 17 16:25:19.184972 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184949 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="pull"
Apr 17 16:25:19.184972 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184965 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="extract"
Apr 17 16:25:19.184972 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.184970 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="extract"
Apr 17 16:25:19.185115 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.185031 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="85ee360f-e36d-4161-bd03-ba315296548e" containerName="extract"
Apr 17 16:25:19.189030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.189012 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.191247 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.191207 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-kkbxk\""
Apr 17 16:25:19.195771 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.195744 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xcd97"]
Apr 17 16:25:19.237714 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.237688 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm"
Apr 17 16:25:19.237714 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.237702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjhrzm" event={"ID":"85ee360f-e36d-4161-bd03-ba315296548e","Type":"ContainerDied","Data":"6db50dc0b9a4ae3917c3c61df5735ec78a4bb1c515b95587e983f16de1ad8588"}
Apr 17 16:25:19.237932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.237726 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6db50dc0b9a4ae3917c3c61df5735ec78a4bb1c515b95587e983f16de1ad8588"
Apr 17 16:25:19.238787 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.238756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl" event={"ID":"e88ee7a6-59a2-4a87-81d8-deabc0ad642e","Type":"ContainerStarted","Data":"ab268da69bf8cf9d73ecd52725ca2d335015f9961eddda15ac61af1e668eed9e"}
Apr 17 16:25:19.242845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.242828 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbf25\" (UniqueName: \"kubernetes.io/projected/7369853a-08ad-4a15-b065-267869763028-kube-api-access-fbf25\") pod \"cert-manager-cainjector-8966b78d4-xcd97\" (UID: \"7369853a-08ad-4a15-b065-267869763028\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.243159 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.242869 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7369853a-08ad-4a15-b065-267869763028-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xcd97\" (UID: \"7369853a-08ad-4a15-b065-267869763028\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.344355 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.344316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbf25\" (UniqueName: \"kubernetes.io/projected/7369853a-08ad-4a15-b065-267869763028-kube-api-access-fbf25\") pod \"cert-manager-cainjector-8966b78d4-xcd97\" (UID: \"7369853a-08ad-4a15-b065-267869763028\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.344555 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.344378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7369853a-08ad-4a15-b065-267869763028-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xcd97\" (UID: \"7369853a-08ad-4a15-b065-267869763028\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.352908 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.352877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7369853a-08ad-4a15-b065-267869763028-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-xcd97\" (UID: \"7369853a-08ad-4a15-b065-267869763028\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.353050 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.353018 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbf25\" (UniqueName: \"kubernetes.io/projected/7369853a-08ad-4a15-b065-267869763028-kube-api-access-fbf25\") pod \"cert-manager-cainjector-8966b78d4-xcd97\" (UID: \"7369853a-08ad-4a15-b065-267869763028\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.500738 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.500649 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97"
Apr 17 16:25:19.623043 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:19.623014 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-xcd97"]
Apr 17 16:25:19.625876 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:19.625848 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7369853a_08ad_4a15_b065_267869763028.slice/crio-56439f1938c86b5ad7eae5ffc710081c9c1e5371b6867554c35f5a63d5d9de0d WatchSource:0}: Error finding container 56439f1938c86b5ad7eae5ffc710081c9c1e5371b6867554c35f5a63d5d9de0d: Status 404 returned error can't find the container with id 56439f1938c86b5ad7eae5ffc710081c9c1e5371b6867554c35f5a63d5d9de0d
Apr 17 16:25:20.242977 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:20.242941 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97" event={"ID":"7369853a-08ad-4a15-b065-267869763028","Type":"ContainerStarted","Data":"56439f1938c86b5ad7eae5ffc710081c9c1e5371b6867554c35f5a63d5d9de0d"}
Apr 17 16:25:23.256000 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:23.255908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl" event={"ID":"e88ee7a6-59a2-4a87-81d8-deabc0ad642e","Type":"ContainerStarted","Data":"c780f3e1c13322cf6840fcd16753f14d1f8789e4554c3dff2f316808cdcef3aa"}
Apr 17 16:25:23.256528 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:23.256017 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:23.257398 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:23.257376 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97" event={"ID":"7369853a-08ad-4a15-b065-267869763028","Type":"ContainerStarted","Data":"5cf92e4252c25f16b8f30524718bf1b57c7a0ceacf12af8d20461e4ff4823298"}
Apr 17 16:25:23.270740 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:23.270685 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl" podStartSLOduration=1.015859202 podStartE2EDuration="5.270672285s" podCreationTimestamp="2026-04-17 16:25:18 +0000 UTC" firstStartedPulling="2026-04-17 16:25:18.694405852 +0000 UTC m=+305.075034191" lastFinishedPulling="2026-04-17 16:25:22.949218928 +0000 UTC m=+309.329847274" observedRunningTime="2026-04-17 16:25:23.269481396 +0000 UTC m=+309.650109758" watchObservedRunningTime="2026-04-17 16:25:23.270672285 +0000 UTC m=+309.651300648"
Apr 17 16:25:23.283723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:23.283670 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-xcd97" podStartSLOduration=0.962662098 podStartE2EDuration="4.283655879s" podCreationTimestamp="2026-04-17 16:25:19 +0000 UTC" firstStartedPulling="2026-04-17 16:25:19.627751632 +0000 UTC m=+306.008379971" lastFinishedPulling="2026-04-17 16:25:22.948745406 +0000 UTC m=+309.329373752" observedRunningTime="2026-04-17 16:25:23.281711748 +0000 UTC m=+309.662340122" watchObservedRunningTime="2026-04-17 16:25:23.283655879 +0000 UTC m=+309.664284240"
Apr 17 16:25:26.203150 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.203087 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"]
Apr 17 16:25:26.206424 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.206407 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.208765 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.208732 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:25:26.209620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.209600 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 16:25:26.209726 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.209691 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-g94rq\""
Apr 17 16:25:26.217370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.217348 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"]
Apr 17 16:25:26.303574 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.303538 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e342dd7d-02bc-41b4-a004-04300ce0d1e5-tmp\") pod \"openshift-lws-operator-bfc7f696d-nvt7l\" (UID: \"e342dd7d-02bc-41b4-a004-04300ce0d1e5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.303748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.303659 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgn96\" (UniqueName: \"kubernetes.io/projected/e342dd7d-02bc-41b4-a004-04300ce0d1e5-kube-api-access-pgn96\") pod \"openshift-lws-operator-bfc7f696d-nvt7l\" (UID: \"e342dd7d-02bc-41b4-a004-04300ce0d1e5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.405067 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.405029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e342dd7d-02bc-41b4-a004-04300ce0d1e5-tmp\") pod \"openshift-lws-operator-bfc7f696d-nvt7l\" (UID: \"e342dd7d-02bc-41b4-a004-04300ce0d1e5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.405067 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.405066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgn96\" (UniqueName: \"kubernetes.io/projected/e342dd7d-02bc-41b4-a004-04300ce0d1e5-kube-api-access-pgn96\") pod \"openshift-lws-operator-bfc7f696d-nvt7l\" (UID: \"e342dd7d-02bc-41b4-a004-04300ce0d1e5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.405486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.405463 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e342dd7d-02bc-41b4-a004-04300ce0d1e5-tmp\") pod \"openshift-lws-operator-bfc7f696d-nvt7l\" (UID: \"e342dd7d-02bc-41b4-a004-04300ce0d1e5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.412549 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.412515 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgn96\" (UniqueName: \"kubernetes.io/projected/e342dd7d-02bc-41b4-a004-04300ce0d1e5-kube-api-access-pgn96\") pod \"openshift-lws-operator-bfc7f696d-nvt7l\" (UID: \"e342dd7d-02bc-41b4-a004-04300ce0d1e5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.516687 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.516599 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"
Apr 17 16:25:26.636651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:26.636620 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l"]
Apr 17 16:25:26.640200 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:26.640164 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode342dd7d_02bc_41b4_a004_04300ce0d1e5.slice/crio-b47cbc26caf6a0f7078a71a931027324b1c235437615194906a19b273f4ae0fc WatchSource:0}: Error finding container b47cbc26caf6a0f7078a71a931027324b1c235437615194906a19b273f4ae0fc: Status 404 returned error can't find the container with id b47cbc26caf6a0f7078a71a931027324b1c235437615194906a19b273f4ae0fc
Apr 17 16:25:27.271828 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:27.271785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l" event={"ID":"e342dd7d-02bc-41b4-a004-04300ce0d1e5","Type":"ContainerStarted","Data":"b47cbc26caf6a0f7078a71a931027324b1c235437615194906a19b273f4ae0fc"}
Apr 17 16:25:29.263222 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:29.263188 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-cxkwl"
Apr 17 16:25:29.280757 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:29.280725 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l" event={"ID":"e342dd7d-02bc-41b4-a004-04300ce0d1e5","Type":"ContainerStarted","Data":"cc1ced1c426fe79babe1d6aa186b230cb27a2d7761f13551ca0f8418e26325f7"}
Apr 17 16:25:29.296280 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:29.296193 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nvt7l" podStartSLOduration=1.553046529 podStartE2EDuration="3.296168316s" podCreationTimestamp="2026-04-17 16:25:26 +0000 UTC" firstStartedPulling="2026-04-17 16:25:26.641631499 +0000 UTC m=+313.022259838" lastFinishedPulling="2026-04-17 16:25:28.384753286 +0000 UTC m=+314.765381625" observedRunningTime="2026-04-17 16:25:29.295141312 +0000 UTC m=+315.675769674" watchObservedRunningTime="2026-04-17 16:25:29.296168316 +0000 UTC m=+315.676796680"
Apr 17 16:25:33.980000 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:33.979966 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"]
Apr 17 16:25:34.007679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.007643 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"]
Apr 17 16:25:34.007844 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.007811 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.010370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.010348 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 16:25:34.010488 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.010352 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wxrs2\""
Apr 17 16:25:34.010488 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.010397 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 16:25:34.074575 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.074542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.074728 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.074594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.074728 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.074684 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf2s\" (UniqueName: \"kubernetes.io/projected/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-kube-api-access-zjf2s\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.175847 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.175810 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf2s\" (UniqueName: \"kubernetes.io/projected/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-kube-api-access-zjf2s\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.176017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.175870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.176017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.175999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.176309 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.176289 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.176356 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.176323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.183680 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.183658 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf2s\" (UniqueName: \"kubernetes.io/projected/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-kube-api-access-zjf2s\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.317307 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.317274 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"
Apr 17 16:25:34.440067 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:34.440032 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr"]
Apr 17 16:25:34.443324 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:34.443293 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7853be8f_465b_4bd8_bc5f_fbd17d2a4ee3.slice/crio-4d9dd6cb64a540d7349715e8a01de58ede55de1c69b03ad14922a2f966fc3d2a WatchSource:0}: Error finding container 4d9dd6cb64a540d7349715e8a01de58ede55de1c69b03ad14922a2f966fc3d2a: Status 404 returned error can't find the container with id 4d9dd6cb64a540d7349715e8a01de58ede55de1c69b03ad14922a2f966fc3d2a
Apr 17 16:25:35.303058 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:35.303025 2569 generic.go:358] "Generic (PLEG): container finished" podID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerID="5348cd70d44f1d67da4ffc9164d21162e11174710c6eda39b7296f7ef02a4fc5" exitCode=0
Apr 17 16:25:35.303567 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:35.303103 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" event={"ID":"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3","Type":"ContainerDied","Data":"5348cd70d44f1d67da4ffc9164d21162e11174710c6eda39b7296f7ef02a4fc5"}
Apr 17 16:25:35.303567 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:35.303132 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" event={"ID":"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3","Type":"ContainerStarted","Data":"4d9dd6cb64a540d7349715e8a01de58ede55de1c69b03ad14922a2f966fc3d2a"}
Apr 17 16:25:42.109067 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.109024 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"]
Apr 17 16:25:42.111954 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.111938 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.115669 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.115641 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 16:25:42.115796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.115646 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 16:25:42.115796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.115654 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 16:25:42.115796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.115669 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gzmb9\""
Apr 17 16:25:42.121516 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.121495 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"]
Apr 17 16:25:42.249771 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.249735 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46da63db-1f77-4978-a1ab-a8bd31c70bb0-cert\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.249771 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.249773 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lngkf\" (UniqueName: \"kubernetes.io/projected/46da63db-1f77-4978-a1ab-a8bd31c70bb0-kube-api-access-lngkf\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.250011 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.249874 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/46da63db-1f77-4978-a1ab-a8bd31c70bb0-metrics-cert\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.250011 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.249914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/46da63db-1f77-4978-a1ab-a8bd31c70bb0-manager-config\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.350937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.350899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/46da63db-1f77-4978-a1ab-a8bd31c70bb0-metrics-cert\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.350937 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.350939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/46da63db-1f77-4978-a1ab-a8bd31c70bb0-manager-config\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.351195 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.351081 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46da63db-1f77-4978-a1ab-a8bd31c70bb0-cert\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.351195 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.351129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lngkf\" (UniqueName: \"kubernetes.io/projected/46da63db-1f77-4978-a1ab-a8bd31c70bb0-kube-api-access-lngkf\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.351576 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.351549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/46da63db-1f77-4978-a1ab-a8bd31c70bb0-manager-config\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.353454 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.353431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46da63db-1f77-4978-a1ab-a8bd31c70bb0-cert\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.353454 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.353443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/46da63db-1f77-4978-a1ab-a8bd31c70bb0-metrics-cert\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.359026 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.359004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngkf\" (UniqueName: \"kubernetes.io/projected/46da63db-1f77-4978-a1ab-a8bd31c70bb0-kube-api-access-lngkf\") pod \"lws-controller-manager-7bf4f445d7-sz9l2\" (UID: \"46da63db-1f77-4978-a1ab-a8bd31c70bb0\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.421814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.421722 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"
Apr 17 16:25:42.566028 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:42.566001 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2"]
Apr 17 16:25:42.568242 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:25:42.568201 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46da63db_1f77_4978_a1ab_a8bd31c70bb0.slice/crio-f6507c1093d22f256a3524603c4f4b0b4465f06ae3daf2cf6b5a6f2e6b28be51 WatchSource:0}: Error finding container f6507c1093d22f256a3524603c4f4b0b4465f06ae3daf2cf6b5a6f2e6b28be51: Status 404 returned error can't find the container with id f6507c1093d22f256a3524603c4f4b0b4465f06ae3daf2cf6b5a6f2e6b28be51
Apr 17 16:25:43.332506 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:43.332468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2" event={"ID":"46da63db-1f77-4978-a1ab-a8bd31c70bb0","Type":"ContainerStarted","Data":"f6507c1093d22f256a3524603c4f4b0b4465f06ae3daf2cf6b5a6f2e6b28be51"}
Apr 17 16:25:45.856417 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:25:45.856365 2569 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: reading blob sha256:9081f33a86ffba20d0d1f48478eb518f22057563dc25b9dea71fad43f703d5bf: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/community-operator-pipeline-prod/opendatahub-operator:3.4.0-ea.1"
Apr 17 16:25:45.856893 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:25:45.856568 2569 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:pull,Image:quay.io/community-operator-pipeline-prod/opendatahub-operator:3.4.0-ea.1,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjf2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000440000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr_openshift-marketplace(7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3): ErrImagePull: unable to pull image or OCI artifact: pull image
err: reading blob sha256:9081f33a86ffba20d0d1f48478eb518f22057563dc25b9dea71fad43f703d5bf: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 16:25:45.857779 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:25:45.857740 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: reading blob sha256:9081f33a86ffba20d0d1f48478eb518f22057563dc25b9dea71fad43f703d5bf: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" Apr 17 16:25:46.344510 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:46.344423 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2" event={"ID":"46da63db-1f77-4978-a1ab-a8bd31c70bb0","Type":"ContainerStarted","Data":"1ed25397d4c13b420944ba325f59e84b2423d3fc156404e55f7a9464c9c9fe36"} Apr 17 16:25:46.344655 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:46.344629 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2" Apr 17 16:25:46.345203 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:25:46.345179 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/community-operator-pipeline-prod/opendatahub-operator:3.4.0-ea.1\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: reading blob sha256:9081f33a86ffba20d0d1f48478eb518f22057563dc25b9dea71fad43f703d5bf: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" Apr 17 16:25:46.361621 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:46.361566 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2" podStartSLOduration=0.853806825 podStartE2EDuration="4.361552776s" podCreationTimestamp="2026-04-17 16:25:42 +0000 UTC" firstStartedPulling="2026-04-17 16:25:42.570512151 +0000 UTC m=+328.951140493" lastFinishedPulling="2026-04-17 16:25:46.078258097 +0000 UTC m=+332.458886444" observedRunningTime="2026-04-17 16:25:46.360171293 +0000 UTC m=+332.740799665" watchObservedRunningTime="2026-04-17 16:25:46.361552776 +0000 UTC m=+332.742181138" Apr 17 16:25:57.350702 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:25:57.350624 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-sz9l2" Apr 17 16:26:01.398200 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:01.398168 2569 generic.go:358] "Generic (PLEG): container finished" podID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerID="8a45c2c1813507689b5d4e66d239bf6b0d6b4633dabb5a59ec0d7ef49b438de2" exitCode=0 Apr 17 16:26:01.398637 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:01.398257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" event={"ID":"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3","Type":"ContainerDied","Data":"8a45c2c1813507689b5d4e66d239bf6b0d6b4633dabb5a59ec0d7ef49b438de2"} Apr 17 16:26:02.403002 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:02.402968 2569 generic.go:358] "Generic (PLEG): container finished" podID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerID="268dabc907963b5802ba67ece89c22a0340875e87e046518663220f150936f6b" exitCode=0 Apr 17 16:26:02.403431 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:02.403022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" event={"ID":"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3","Type":"ContainerDied","Data":"268dabc907963b5802ba67ece89c22a0340875e87e046518663220f150936f6b"} Apr 17 16:26:03.522722 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.522698 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" Apr 17 16:26:03.523873 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.523855 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-bundle\") pod \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " Apr 17 16:26:03.523928 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.523885 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-util\") pod \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " Apr 17 16:26:03.523962 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.523945 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjf2s\" (UniqueName: \"kubernetes.io/projected/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-kube-api-access-zjf2s\") pod \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\" (UID: \"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3\") " Apr 17 16:26:03.524555 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.524530 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-bundle" (OuterVolumeSpecName: "bundle") pod "7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" (UID: "7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:03.526119 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.526091 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-kube-api-access-zjf2s" (OuterVolumeSpecName: "kube-api-access-zjf2s") pod "7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" (UID: "7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3"). InnerVolumeSpecName "kube-api-access-zjf2s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:26:03.528723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.528700 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-util" (OuterVolumeSpecName: "util") pod "7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" (UID: "7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:03.624923 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.624890 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjf2s\" (UniqueName: \"kubernetes.io/projected/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-kube-api-access-zjf2s\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:03.624923 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.624917 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:03.624923 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:03.624927 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:04.411065 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:04.411036 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" Apr 17 16:26:04.411065 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:04.411039 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5zsctr" event={"ID":"7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3","Type":"ContainerDied","Data":"4d9dd6cb64a540d7349715e8a01de58ede55de1c69b03ad14922a2f966fc3d2a"} Apr 17 16:26:04.411065 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:04.411070 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d9dd6cb64a540d7349715e8a01de58ede55de1c69b03ad14922a2f966fc3d2a" Apr 17 16:26:09.732811 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.732778 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks"] Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733102 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerName="util" Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733112 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerName="util" Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733122 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerName="extract" Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733127 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerName="extract" Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733136 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerName="pull" Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733143 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" containerName="pull" Apr 17 16:26:09.733278 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.733196 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7853be8f-465b-4bd8-bc5f-fbd17d2a4ee3" 
containerName="extract" Apr 17 16:26:09.737522 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.737504 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.739980 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.739954 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:26:09.740107 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.740003 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:26:09.741054 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.741031 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wxrs2\"" Apr 17 16:26:09.750701 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.750677 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks"] Apr 17 16:26:09.771850 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.771825 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.771961 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.771858 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dxkc\" (UniqueName: \"kubernetes.io/projected/ef0d35e2-887c-48e5-a563-2be94d8e747a-kube-api-access-9dxkc\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.772017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.771977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.872651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.872608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.872651 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.872647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dxkc\" (UniqueName: \"kubernetes.io/projected/ef0d35e2-887c-48e5-a563-2be94d8e747a-kube-api-access-9dxkc\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.872919 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.872710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.873000 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.872979 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.873044 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.873031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:09.890893 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:09.890859 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dxkc\" (UniqueName: \"kubernetes.io/projected/ef0d35e2-887c-48e5-a563-2be94d8e747a-kube-api-access-9dxkc\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:10.046208 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:10.046107 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:10.180282 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:10.180209 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks"] Apr 17 16:26:10.182402 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:26:10.182371 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef0d35e2_887c_48e5_a563_2be94d8e747a.slice/crio-438d3ff57500094ef98824f7f7423b42447a4d544c70557c307c2b6cc5204baf WatchSource:0}: Error finding container 438d3ff57500094ef98824f7f7423b42447a4d544c70557c307c2b6cc5204baf: Status 404 returned error can't find the container with id 438d3ff57500094ef98824f7f7423b42447a4d544c70557c307c2b6cc5204baf Apr 17 16:26:10.432662 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:10.432626 2569 generic.go:358] "Generic (PLEG): container finished" podID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerID="77f2840f9deedd078fa86503b1b2f7a706f0f88828e916e1afa1b2afea614c96" exitCode=0 Apr 17 16:26:10.432877 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:10.432683 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" event={"ID":"ef0d35e2-887c-48e5-a563-2be94d8e747a","Type":"ContainerDied","Data":"77f2840f9deedd078fa86503b1b2f7a706f0f88828e916e1afa1b2afea614c96"} Apr 17 16:26:10.432877 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:10.432719 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" event={"ID":"ef0d35e2-887c-48e5-a563-2be94d8e747a","Type":"ContainerStarted","Data":"438d3ff57500094ef98824f7f7423b42447a4d544c70557c307c2b6cc5204baf"} Apr 17 16:26:11.437795 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.437710 2569 generic.go:358] "Generic (PLEG): container finished" podID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerID="54a12fcbae252446c0672ee48abe1ff3646becbd15cf3730923095c7988f9041" exitCode=0 Apr 17 16:26:11.437795 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.437789 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" event={"ID":"ef0d35e2-887c-48e5-a563-2be94d8e747a","Type":"ContainerDied","Data":"54a12fcbae252446c0672ee48abe1ff3646becbd15cf3730923095c7988f9041"} Apr 17 16:26:11.685960 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.685929 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4"] Apr 17 16:26:11.689310 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.689248 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.692109 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.692085 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 16:26:11.692201 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.692176 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 16:26:11.694486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.694462 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jlzxq\"" Apr 17 16:26:11.699185 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.699168 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 16:26:11.699494 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.699479 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 16:26:11.705993 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.705974 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4"] Apr 17 16:26:11.787050 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.787016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpw46\" (UniqueName: \"kubernetes.io/projected/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-kube-api-access-kpw46\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.787224 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.787079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.787224 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.787113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.887653 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.887617 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpw46\" (UniqueName: \"kubernetes.io/projected/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-kube-api-access-kpw46\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.887821 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.887681 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.887821 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.887707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.890153 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.890123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.890290 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.890152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.897931 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.897903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpw46\" (UniqueName: \"kubernetes.io/projected/177e44a7-ba5c-44b4-901b-5c0baa4df9fe-kube-api-access-kpw46\") pod \"opendatahub-operator-controller-manager-54994d49cf-xncm4\" (UID: \"177e44a7-ba5c-44b4-901b-5c0baa4df9fe\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:11.999938 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:11.999840 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:12.130176 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:12.130139 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4"] Apr 17 16:26:12.134732 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:26:12.134700 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod177e44a7_ba5c_44b4_901b_5c0baa4df9fe.slice/crio-051d5a73164f623598dda24bd8ff83b71cc1c6299210bf650a5c4419c873df4d WatchSource:0}: Error finding container 051d5a73164f623598dda24bd8ff83b71cc1c6299210bf650a5c4419c873df4d: Status 404 returned error can't find the container with id 051d5a73164f623598dda24bd8ff83b71cc1c6299210bf650a5c4419c873df4d Apr 17 16:26:12.442639 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:12.442606 2569 generic.go:358] "Generic (PLEG): container finished" podID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerID="3a78de7e154ef3c6d705c2a297e929b9a7f1a40ca7669a9eb735c7e631d4614e" exitCode=0 Apr 17 16:26:12.443084 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:12.442663 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" event={"ID":"ef0d35e2-887c-48e5-a563-2be94d8e747a","Type":"ContainerDied","Data":"3a78de7e154ef3c6d705c2a297e929b9a7f1a40ca7669a9eb735c7e631d4614e"} Apr 17 16:26:12.443803 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:12.443780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" event={"ID":"177e44a7-ba5c-44b4-901b-5c0baa4df9fe","Type":"ContainerStarted","Data":"051d5a73164f623598dda24bd8ff83b71cc1c6299210bf650a5c4419c873df4d"} Apr 17 16:26:14.241117 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.241091 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:14.309746 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.309706 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dxkc\" (UniqueName: \"kubernetes.io/projected/ef0d35e2-887c-48e5-a563-2be94d8e747a-kube-api-access-9dxkc\") pod \"ef0d35e2-887c-48e5-a563-2be94d8e747a\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " Apr 17 16:26:14.309926 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.309812 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-util\") pod \"ef0d35e2-887c-48e5-a563-2be94d8e747a\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " Apr 17 16:26:14.309926 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.309892 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-bundle\") pod \"ef0d35e2-887c-48e5-a563-2be94d8e747a\" (UID: \"ef0d35e2-887c-48e5-a563-2be94d8e747a\") " Apr 17 16:26:14.310851 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.310815 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-bundle" (OuterVolumeSpecName: "bundle") pod "ef0d35e2-887c-48e5-a563-2be94d8e747a" (UID: "ef0d35e2-887c-48e5-a563-2be94d8e747a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:14.312431 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.312404 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef0d35e2-887c-48e5-a563-2be94d8e747a-kube-api-access-9dxkc" (OuterVolumeSpecName: "kube-api-access-9dxkc") pod "ef0d35e2-887c-48e5-a563-2be94d8e747a" (UID: "ef0d35e2-887c-48e5-a563-2be94d8e747a"). InnerVolumeSpecName "kube-api-access-9dxkc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:26:14.315903 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.315860 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-util" (OuterVolumeSpecName: "util") pod "ef0d35e2-887c-48e5-a563-2be94d8e747a" (UID: "ef0d35e2-887c-48e5-a563-2be94d8e747a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:14.410623 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.410584 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:14.410623 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.410613 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9dxkc\" (UniqueName: \"kubernetes.io/projected/ef0d35e2-887c-48e5-a563-2be94d8e747a-kube-api-access-9dxkc\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:14.410623 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.410630 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef0d35e2-887c-48e5-a563-2be94d8e747a-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:14.453312 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.453268 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" event={"ID":"ef0d35e2-887c-48e5-a563-2be94d8e747a","Type":"ContainerDied","Data":"438d3ff57500094ef98824f7f7423b42447a4d544c70557c307c2b6cc5204baf"} Apr 17 16:26:14.453312 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.453295 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9gwqks" Apr 17 16:26:14.453312 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:14.453311 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="438d3ff57500094ef98824f7f7423b42447a4d544c70557c307c2b6cc5204baf" Apr 17 16:26:15.457831 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:15.457787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" event={"ID":"177e44a7-ba5c-44b4-901b-5c0baa4df9fe","Type":"ContainerStarted","Data":"5513e78d875a744e6a478445a54c0b781bd16674bbb46c08f1176b648cad13a5"} Apr 17 16:26:15.458275 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:15.458024 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:15.476817 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:15.476768 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" podStartSLOduration=1.748842223 podStartE2EDuration="4.476755377s" podCreationTimestamp="2026-04-17 16:26:11 +0000 UTC" firstStartedPulling="2026-04-17 16:26:12.136427622 +0000 UTC m=+358.517055961" lastFinishedPulling="2026-04-17 16:26:14.864340772 +0000 UTC m=+361.244969115" observedRunningTime="2026-04-17 16:26:15.475114895 +0000 UTC m=+361.855743268" watchObservedRunningTime="2026-04-17 16:26:15.476755377 +0000 UTC m=+361.857383737" Apr 17 16:26:26.463509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:26.463481 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-xncm4" Apr 17 16:26:28.514039 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514004 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw"] Apr 17 16:26:28.514514 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514498 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="pull" Apr 17 16:26:28.514558 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514517 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="pull" Apr 17 16:26:28.514558 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514536 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="util" Apr 17 16:26:28.514558 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514544 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="util" Apr 17 16:26:28.514650 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514571 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="extract" Apr 17 16:26:28.514650 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514580 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="extract" Apr 17 16:26:28.514715 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.514653 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef0d35e2-887c-48e5-a563-2be94d8e747a" containerName="extract" Apr 17 16:26:28.517126 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.517105 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.521356 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.521328 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:26:28.521500 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.521456 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:26:28.522530 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.522513 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wxrs2\"" Apr 17 16:26:28.530311 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.530289 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw"] Apr 17 16:26:28.636112 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.636079 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.636309 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.636191 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpkt5\" (UniqueName: \"kubernetes.io/projected/22301b07-0427-477e-8b05-601a1741ee7e-kube-api-access-qpkt5\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.636309 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.636217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.737361 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.737326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpkt5\" (UniqueName: \"kubernetes.io/projected/22301b07-0427-477e-8b05-601a1741ee7e-kube-api-access-qpkt5\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.737361 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.737366 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.737619 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.737408 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.737774 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.737756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.737832 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.737797 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.754118 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.754091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpkt5\" (UniqueName: \"kubernetes.io/projected/22301b07-0427-477e-8b05-601a1741ee7e-kube-api-access-qpkt5\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.828179 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.828142 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:28.950179 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:28.950155 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw"] Apr 17 16:26:28.952391 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:26:28.952362 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22301b07_0427_477e_8b05_601a1741ee7e.slice/crio-bb31c3bc3b41a354f8f2f080c4ac6e75ca5ee5295cd530993febd51e640309c8 WatchSource:0}: Error finding container bb31c3bc3b41a354f8f2f080c4ac6e75ca5ee5295cd530993febd51e640309c8: Status 404 returned error can't find the container with id bb31c3bc3b41a354f8f2f080c4ac6e75ca5ee5295cd530993febd51e640309c8 Apr 17 16:26:29.508678 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:29.508639 2569 generic.go:358] "Generic (PLEG): container finished" podID="22301b07-0427-477e-8b05-601a1741ee7e" containerID="da96c02e030442a14bd8e8dd68acb05ce3ba7c4ccd4f99a11e8121378178415f" exitCode=0 Apr 17 16:26:29.508845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:29.508690 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" event={"ID":"22301b07-0427-477e-8b05-601a1741ee7e","Type":"ContainerDied","Data":"da96c02e030442a14bd8e8dd68acb05ce3ba7c4ccd4f99a11e8121378178415f"} Apr 17 16:26:29.508845 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:29.508716 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" event={"ID":"22301b07-0427-477e-8b05-601a1741ee7e","Type":"ContainerStarted","Data":"bb31c3bc3b41a354f8f2f080c4ac6e75ca5ee5295cd530993febd51e640309c8"} Apr 17 16:26:31.517123 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:31.517090 2569 generic.go:358] "Generic (PLEG): container finished" podID="22301b07-0427-477e-8b05-601a1741ee7e" containerID="e26081f02aba4b29ffa48db986dfe85429fb40102215c9b83defee1ccea6bd82" exitCode=0 Apr 17 16:26:31.517641 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:31.517175 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" event={"ID":"22301b07-0427-477e-8b05-601a1741ee7e","Type":"ContainerDied","Data":"e26081f02aba4b29ffa48db986dfe85429fb40102215c9b83defee1ccea6bd82"} Apr 17 16:26:32.527874 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:32.527830 2569 generic.go:358] "Generic (PLEG): container finished" podID="22301b07-0427-477e-8b05-601a1741ee7e" containerID="0fdcf83fa60b976fcc9389980c80d993fb2d21d3cc305d54823bfba73e309d3c" exitCode=0 Apr 17 16:26:32.528377 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:32.527942 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" event={"ID":"22301b07-0427-477e-8b05-601a1741ee7e","Type":"ContainerDied","Data":"0fdcf83fa60b976fcc9389980c80d993fb2d21d3cc305d54823bfba73e309d3c"} Apr 17 16:26:33.653679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.653655 
2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:33.778500 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.778462 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-util\") pod \"22301b07-0427-477e-8b05-601a1741ee7e\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " Apr 17 16:26:33.778710 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.778597 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpkt5\" (UniqueName: \"kubernetes.io/projected/22301b07-0427-477e-8b05-601a1741ee7e-kube-api-access-qpkt5\") pod \"22301b07-0427-477e-8b05-601a1741ee7e\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " Apr 17 16:26:33.778710 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.778630 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-bundle\") pod \"22301b07-0427-477e-8b05-601a1741ee7e\" (UID: \"22301b07-0427-477e-8b05-601a1741ee7e\") " Apr 17 16:26:33.779519 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.779490 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-bundle" (OuterVolumeSpecName: "bundle") pod "22301b07-0427-477e-8b05-601a1741ee7e" (UID: "22301b07-0427-477e-8b05-601a1741ee7e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:33.780797 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.780771 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22301b07-0427-477e-8b05-601a1741ee7e-kube-api-access-qpkt5" (OuterVolumeSpecName: "kube-api-access-qpkt5") pod "22301b07-0427-477e-8b05-601a1741ee7e" (UID: "22301b07-0427-477e-8b05-601a1741ee7e"). InnerVolumeSpecName "kube-api-access-qpkt5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:26:33.783583 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.783521 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-util" (OuterVolumeSpecName: "util") pod "22301b07-0427-477e-8b05-601a1741ee7e" (UID: "22301b07-0427-477e-8b05-601a1741ee7e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:33.879870 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.879823 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpkt5\" (UniqueName: \"kubernetes.io/projected/22301b07-0427-477e-8b05-601a1741ee7e-kube-api-access-qpkt5\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:33.879870 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.879862 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:33.879870 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:33.879872 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22301b07-0427-477e-8b05-601a1741ee7e-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:34.537720 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:34.537634 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" Apr 17 16:26:34.537893 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:34.537629 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355qdmw" event={"ID":"22301b07-0427-477e-8b05-601a1741ee7e","Type":"ContainerDied","Data":"bb31c3bc3b41a354f8f2f080c4ac6e75ca5ee5295cd530993febd51e640309c8"} Apr 17 16:26:34.537893 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:34.537741 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb31c3bc3b41a354f8f2f080c4ac6e75ca5ee5295cd530993febd51e640309c8" Apr 17 16:26:42.636070 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636038 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s"] Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636396 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22301b07-0427-477e-8b05-601a1741ee7e" containerName="util" Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636409 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="22301b07-0427-477e-8b05-601a1741ee7e" containerName="util" Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636431 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22301b07-0427-477e-8b05-601a1741ee7e" containerName="pull" Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636436 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="22301b07-0427-477e-8b05-601a1741ee7e" containerName="pull" Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636447 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22301b07-0427-477e-8b05-601a1741ee7e" containerName="extract" Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636452 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="22301b07-0427-477e-8b05-601a1741ee7e" containerName="extract" Apr 17 16:26:42.636607 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.636506 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="22301b07-0427-477e-8b05-601a1741ee7e" 
containerName="extract" Apr 17 16:26:42.639292 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.639265 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.644042 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.644015 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:26:42.644173 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.644109 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wxrs2\"" Apr 17 16:26:42.652669 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.652639 2569 status_manager.go:895] "Failed to get status for pod" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" err="pods \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object" Apr 17 16:26:42.655636 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.655618 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:26:42.668101 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.668074 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s"] Apr 17 16:26:42.755118 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.755078 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tph4f\" (UniqueName: \"kubernetes.io/projected/2fbdc4d9-b0a5-4d96-981a-255daad4b599-kube-api-access-tph4f\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.755319 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.755147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.755319 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.755200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.855993 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.855956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-util\") pod 
\"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.856195 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.856046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.856195 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.856100 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tph4f\" (UniqueName: \"kubernetes.io/projected/2fbdc4d9-b0a5-4d96-981a-255daad4b599-kube-api-access-tph4f\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.856381 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.856360 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.856458 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.856436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.866727 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.866696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tph4f\" (UniqueName: \"kubernetes.io/projected/2fbdc4d9-b0a5-4d96-981a-255daad4b599-kube-api-access-tph4f\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:42.948779 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:42.948690 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:43.073544 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:43.073514 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s"] Apr 17 16:26:43.076040 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:26:43.076004 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fbdc4d9_b0a5_4d96_981a_255daad4b599.slice/crio-99f9e4fd884e9fac9bcf61936678317657144b8e9bb41b2fdac59ee8af8834cd WatchSource:0}: Error finding container 99f9e4fd884e9fac9bcf61936678317657144b8e9bb41b2fdac59ee8af8834cd: Status 404 returned error can't find the container with id 99f9e4fd884e9fac9bcf61936678317657144b8e9bb41b2fdac59ee8af8834cd Apr 17 16:26:43.576149 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:43.576115 2569 generic.go:358] "Generic (PLEG): container finished" podID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerID="524a2a30546f2d1d42099416e8d0e66c192b2f9a5bceceb2646f41db0b9ee8d3" exitCode=0 Apr 17 16:26:43.576391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:43.576193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" event={"ID":"2fbdc4d9-b0a5-4d96-981a-255daad4b599","Type":"ContainerDied","Data":"524a2a30546f2d1d42099416e8d0e66c192b2f9a5bceceb2646f41db0b9ee8d3"} Apr 17 16:26:43.576391 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:43.576218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" event={"ID":"2fbdc4d9-b0a5-4d96-981a-255daad4b599","Type":"ContainerStarted","Data":"99f9e4fd884e9fac9bcf61936678317657144b8e9bb41b2fdac59ee8af8834cd"} Apr 17 16:26:44.582216 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:44.582186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" event={"ID":"2fbdc4d9-b0a5-4d96-981a-255daad4b599","Type":"ContainerStarted","Data":"4a7c75c82f09f4319a844ae73f5e3887264f3a3b089530ea7590f5d592a3b473"} Apr 17 16:26:45.586742 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:45.586704 2569 generic.go:358] "Generic (PLEG): container finished" podID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerID="4a7c75c82f09f4319a844ae73f5e3887264f3a3b089530ea7590f5d592a3b473" exitCode=0 Apr 17 16:26:45.587136 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:45.586788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" event={"ID":"2fbdc4d9-b0a5-4d96-981a-255daad4b599","Type":"ContainerDied","Data":"4a7c75c82f09f4319a844ae73f5e3887264f3a3b089530ea7590f5d592a3b473"} Apr 17 16:26:46.591873 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:46.591837 2569 generic.go:358] "Generic (PLEG): container finished" podID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerID="154bc413c498e64963a7f904ae070fbb1222035ba3bda8f36cae478a2eadf007" exitCode=0 Apr 17 16:26:46.592277 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:46.591904 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" 
event={"ID":"2fbdc4d9-b0a5-4d96-981a-255daad4b599","Type":"ContainerDied","Data":"154bc413c498e64963a7f904ae070fbb1222035ba3bda8f36cae478a2eadf007"} Apr 17 16:26:47.719059 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.719034 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:47.799381 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.799347 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-util\") pod \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " Apr 17 16:26:47.799381 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.799394 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-bundle\") pod \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " Apr 17 16:26:47.799593 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.799449 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tph4f\" (UniqueName: \"kubernetes.io/projected/2fbdc4d9-b0a5-4d96-981a-255daad4b599-kube-api-access-tph4f\") pod \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\" (UID: \"2fbdc4d9-b0a5-4d96-981a-255daad4b599\") " Apr 17 16:26:47.800382 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.800351 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-bundle" (OuterVolumeSpecName: "bundle") pod "2fbdc4d9-b0a5-4d96-981a-255daad4b599" (UID: "2fbdc4d9-b0a5-4d96-981a-255daad4b599"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:47.801532 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.801515 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fbdc4d9-b0a5-4d96-981a-255daad4b599-kube-api-access-tph4f" (OuterVolumeSpecName: "kube-api-access-tph4f") pod "2fbdc4d9-b0a5-4d96-981a-255daad4b599" (UID: "2fbdc4d9-b0a5-4d96-981a-255daad4b599"). InnerVolumeSpecName "kube-api-access-tph4f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:26:47.804801 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.804761 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-util" (OuterVolumeSpecName: "util") pod "2fbdc4d9-b0a5-4d96-981a-255daad4b599" (UID: "2fbdc4d9-b0a5-4d96-981a-255daad4b599"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:26:47.900161 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.900061 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tph4f\" (UniqueName: \"kubernetes.io/projected/2fbdc4d9-b0a5-4d96-981a-255daad4b599-kube-api-access-tph4f\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:47.900161 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.900099 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:47.900161 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:47.900113 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2fbdc4d9-b0a5-4d96-981a-255daad4b599-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:26:48.600744 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:48.600710 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" event={"ID":"2fbdc4d9-b0a5-4d96-981a-255daad4b599","Type":"ContainerDied","Data":"99f9e4fd884e9fac9bcf61936678317657144b8e9bb41b2fdac59ee8af8834cd"} Apr 17 16:26:48.600912 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:48.600746 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2ngw9s" Apr 17 16:26:48.600912 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:26:48.600747 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f9e4fd884e9fac9bcf61936678317657144b8e9bb41b2fdac59ee8af8834cd" Apr 17 16:27:04.644836 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.644795 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv"] Apr 17 16:27:04.645397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645300 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerName="extract" Apr 17 16:27:04.645397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645316 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerName="extract" Apr 17 16:27:04.645397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645330 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerName="pull" Apr 17 16:27:04.645397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645338 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerName="pull" Apr 17 16:27:04.645397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645358 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerName="util" Apr 17 16:27:04.645397 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645366 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" containerName="util" Apr 17 16:27:04.645725 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.645449 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fbdc4d9-b0a5-4d96-981a-255daad4b599" 
containerName="extract" Apr 17 16:27:04.648827 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.648807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.651275 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.651251 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 16:27:04.651388 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.651342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:27:04.651388 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.651356 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-gkp6x\"" Apr 17 16:27:04.651610 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.651597 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:27:04.658025 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.658001 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv"] Apr 17 16:27:04.748620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748584 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f5604b22-37d7-4787-9612-bca485e7867d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzm7j\" (UniqueName: \"kubernetes.io/projected/f5604b22-37d7-4787-9612-bca485e7867d-kube-api-access-lzm7j\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f5604b22-37d7-4787-9612-bca485e7867d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f5604b22-37d7-4787-9612-bca485e7867d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748721 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748785 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.748856 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748851 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.749061 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.748938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850057 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f5604b22-37d7-4787-9612-bca485e7867d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850069 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850103 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" 
(UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850140 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850262 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850253 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850533 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f5604b22-37d7-4787-9612-bca485e7867d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850533 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzm7j\" (UniqueName: \"kubernetes.io/projected/f5604b22-37d7-4787-9612-bca485e7867d-kube-api-access-lzm7j\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850533 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f5604b22-37d7-4787-9612-bca485e7867d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850686 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-workload-certs\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850686 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850648 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850828 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850800 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.850901 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.850831 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.851119 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.851097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f5604b22-37d7-4787-9612-bca485e7867d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.852842 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.852816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f5604b22-37d7-4787-9612-bca485e7867d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.853078 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.853060 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f5604b22-37d7-4787-9612-bca485e7867d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.857675 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.857650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f5604b22-37d7-4787-9612-bca485e7867d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 
16:27:04.858389 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.858366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzm7j\" (UniqueName: \"kubernetes.io/projected/f5604b22-37d7-4787-9612-bca485e7867d-kube-api-access-lzm7j\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv\" (UID: \"f5604b22-37d7-4787-9612-bca485e7867d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:04.962166 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:04.962070 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:05.090260 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:05.090205 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv"] Apr 17 16:27:05.091563 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:05.091538 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5604b22_37d7_4787_9612_bca485e7867d.slice/crio-b750399380523660009c209f1e4293a95e02bc081024b265ae7821d684512c7c WatchSource:0}: Error finding container b750399380523660009c209f1e4293a95e02bc081024b265ae7821d684512c7c: Status 404 returned error can't find the container with id b750399380523660009c209f1e4293a95e02bc081024b265ae7821d684512c7c Apr 17 16:27:05.666534 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:05.666489 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" event={"ID":"f5604b22-37d7-4787-9612-bca485e7867d","Type":"ContainerStarted","Data":"b750399380523660009c209f1e4293a95e02bc081024b265ae7821d684512c7c"} Apr 17 16:27:07.432511 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:07.432467 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:27:07.432911 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:07.432560 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:27:07.432911 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:07.432598 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:27:07.677014 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:07.676975 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" event={"ID":"f5604b22-37d7-4787-9612-bca485e7867d","Type":"ContainerStarted","Data":"49baf2efba601250627a75acb486a571d94a5a04126a8fe9eb035929ec922da6"} Apr 17 16:27:07.697335 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:07.697205 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" podStartSLOduration=1.358543829 podStartE2EDuration="3.697186217s" podCreationTimestamp="2026-04-17 16:27:04 +0000 UTC" firstStartedPulling="2026-04-17 16:27:05.093558953 +0000 UTC m=+411.474187306" lastFinishedPulling="2026-04-17 16:27:07.432201339 
+0000 UTC m=+413.812829694" observedRunningTime="2026-04-17 16:27:07.694091642 +0000 UTC m=+414.074720004" watchObservedRunningTime="2026-04-17 16:27:07.697186217 +0000 UTC m=+414.077814579" Apr 17 16:27:07.962909 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:07.962809 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:08.967756 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:08.967723 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:09.684160 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:09.684128 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:09.685241 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:09.685209 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv" Apr 17 16:27:29.452714 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.452625 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mqnpf"] Apr 17 16:27:29.456242 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.456206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:29.459148 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.459126 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-t9gw2\"" Apr 17 16:27:29.460006 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.459348 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 16:27:29.460540 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.460380 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 16:27:29.464672 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.464652 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mqnpf"] Apr 17 16:27:29.560532 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.560494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk9kv\" (UniqueName: \"kubernetes.io/projected/af49f1ec-ec08-43e7-b6a7-73f92591d664-kube-api-access-lk9kv\") pod \"kuadrant-operator-catalog-mqnpf\" (UID: \"af49f1ec-ec08-43e7-b6a7-73f92591d664\") " pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:29.661118 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.661082 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lk9kv\" (UniqueName: \"kubernetes.io/projected/af49f1ec-ec08-43e7-b6a7-73f92591d664-kube-api-access-lk9kv\") pod \"kuadrant-operator-catalog-mqnpf\" (UID: \"af49f1ec-ec08-43e7-b6a7-73f92591d664\") " pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:29.669183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.669152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk9kv\" (UniqueName: 
\"kubernetes.io/projected/af49f1ec-ec08-43e7-b6a7-73f92591d664-kube-api-access-lk9kv\") pod \"kuadrant-operator-catalog-mqnpf\" (UID: \"af49f1ec-ec08-43e7-b6a7-73f92591d664\") " pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:29.773926 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.773848 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:29.811886 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.811856 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mqnpf"] Apr 17 16:27:29.894707 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:29.894685 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mqnpf"] Apr 17 16:27:29.896685 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:29.896659 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf49f1ec_ec08_43e7_b6a7_73f92591d664.slice/crio-2386276ce09eef14f57041d282bb8359333ab0ebab10d1633a30695a0f22db4e WatchSource:0}: Error finding container 2386276ce09eef14f57041d282bb8359333ab0ebab10d1633a30695a0f22db4e: Status 404 returned error can't find the container with id 2386276ce09eef14f57041d282bb8359333ab0ebab10d1633a30695a0f22db4e Apr 17 16:27:30.020747 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.020709 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x66nk"] Apr 17 16:27:30.025330 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.025271 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:30.031186 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.031159 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x66nk"] Apr 17 16:27:30.065035 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.065002 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72tv\" (UniqueName: \"kubernetes.io/projected/3289d8f3-9ffd-4bb2-965e-65978d3a1a33-kube-api-access-h72tv\") pod \"kuadrant-operator-catalog-x66nk\" (UID: \"3289d8f3-9ffd-4bb2-965e-65978d3a1a33\") " pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:30.165613 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.165583 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h72tv\" (UniqueName: \"kubernetes.io/projected/3289d8f3-9ffd-4bb2-965e-65978d3a1a33-kube-api-access-h72tv\") pod \"kuadrant-operator-catalog-x66nk\" (UID: \"3289d8f3-9ffd-4bb2-965e-65978d3a1a33\") " pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:30.174625 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.174600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72tv\" (UniqueName: \"kubernetes.io/projected/3289d8f3-9ffd-4bb2-965e-65978d3a1a33-kube-api-access-h72tv\") pod \"kuadrant-operator-catalog-x66nk\" (UID: \"3289d8f3-9ffd-4bb2-965e-65978d3a1a33\") " pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:30.336615 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.336585 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:30.465645 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.465592 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-x66nk"] Apr 17 16:27:30.468896 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:30.468865 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3289d8f3_9ffd_4bb2_965e_65978d3a1a33.slice/crio-f6dcb0c2ed7ce9b8b8a28e0238b1430b193f1d3b976638beeeed861cc96547a4 WatchSource:0}: Error finding container f6dcb0c2ed7ce9b8b8a28e0238b1430b193f1d3b976638beeeed861cc96547a4: Status 404 returned error can't find the container with id f6dcb0c2ed7ce9b8b8a28e0238b1430b193f1d3b976638beeeed861cc96547a4 Apr 17 16:27:30.761979 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.761890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" event={"ID":"3289d8f3-9ffd-4bb2-965e-65978d3a1a33","Type":"ContainerStarted","Data":"f6dcb0c2ed7ce9b8b8a28e0238b1430b193f1d3b976638beeeed861cc96547a4"} Apr 17 16:27:30.763497 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:30.763466 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" event={"ID":"af49f1ec-ec08-43e7-b6a7-73f92591d664","Type":"ContainerStarted","Data":"2386276ce09eef14f57041d282bb8359333ab0ebab10d1633a30695a0f22db4e"} Apr 17 16:27:32.772716 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:32.772675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" event={"ID":"3289d8f3-9ffd-4bb2-965e-65978d3a1a33","Type":"ContainerStarted","Data":"289251f3552864512f45b19907f4dc99ae473514b34f7c1b89125832a5d69c12"} Apr 17 16:27:32.774089 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:32.774060 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" event={"ID":"af49f1ec-ec08-43e7-b6a7-73f92591d664","Type":"ContainerStarted","Data":"5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577"} Apr 17 16:27:32.774208 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:32.774156 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" podUID="af49f1ec-ec08-43e7-b6a7-73f92591d664" containerName="registry-server" containerID="cri-o://5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577" gracePeriod=2 Apr 17 16:27:32.788503 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:32.788461 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" podStartSLOduration=1.210225791 podStartE2EDuration="2.788445195s" podCreationTimestamp="2026-04-17 16:27:30 +0000 UTC" firstStartedPulling="2026-04-17 16:27:30.47017373 +0000 UTC m=+436.850802070" lastFinishedPulling="2026-04-17 16:27:32.048393131 +0000 UTC m=+438.429021474" observedRunningTime="2026-04-17 16:27:32.7881973 +0000 UTC m=+439.168825658" watchObservedRunningTime="2026-04-17 16:27:32.788445195 +0000 UTC m=+439.169073550" Apr 17 16:27:32.802414 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:32.802369 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" podStartSLOduration=1.654524255 podStartE2EDuration="3.802354553s" podCreationTimestamp="2026-04-17 
16:27:29 +0000 UTC" firstStartedPulling="2026-04-17 16:27:29.897824248 +0000 UTC m=+436.278452587" lastFinishedPulling="2026-04-17 16:27:32.045654543 +0000 UTC m=+438.426282885" observedRunningTime="2026-04-17 16:27:32.801872665 +0000 UTC m=+439.182501027" watchObservedRunningTime="2026-04-17 16:27:32.802354553 +0000 UTC m=+439.182982914" Apr 17 16:27:33.018931 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.018907 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:33.093208 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.093178 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk9kv\" (UniqueName: \"kubernetes.io/projected/af49f1ec-ec08-43e7-b6a7-73f92591d664-kube-api-access-lk9kv\") pod \"af49f1ec-ec08-43e7-b6a7-73f92591d664\" (UID: \"af49f1ec-ec08-43e7-b6a7-73f92591d664\") " Apr 17 16:27:33.095295 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.095271 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af49f1ec-ec08-43e7-b6a7-73f92591d664-kube-api-access-lk9kv" (OuterVolumeSpecName: "kube-api-access-lk9kv") pod "af49f1ec-ec08-43e7-b6a7-73f92591d664" (UID: "af49f1ec-ec08-43e7-b6a7-73f92591d664"). InnerVolumeSpecName "kube-api-access-lk9kv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:27:33.193912 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.193878 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lk9kv\" (UniqueName: \"kubernetes.io/projected/af49f1ec-ec08-43e7-b6a7-73f92591d664-kube-api-access-lk9kv\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:33.781344 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.781303 2569 generic.go:358] "Generic (PLEG): container finished" podID="af49f1ec-ec08-43e7-b6a7-73f92591d664" containerID="5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577" exitCode=0 Apr 17 16:27:33.781790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.781400 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" event={"ID":"af49f1ec-ec08-43e7-b6a7-73f92591d664","Type":"ContainerDied","Data":"5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577"} Apr 17 16:27:33.781790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.781435 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" event={"ID":"af49f1ec-ec08-43e7-b6a7-73f92591d664","Type":"ContainerDied","Data":"2386276ce09eef14f57041d282bb8359333ab0ebab10d1633a30695a0f22db4e"} Apr 17 16:27:33.781790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.781454 2569 scope.go:117] "RemoveContainer" containerID="5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577" Apr 17 16:27:33.781790 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.781408 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mqnpf" Apr 17 16:27:33.791168 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.791146 2569 scope.go:117] "RemoveContainer" containerID="5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577" Apr 17 16:27:33.791423 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:27:33.791406 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577\": container with ID starting with 5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577 not found: ID does not exist" containerID="5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577" Apr 17 16:27:33.791477 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.791432 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577"} err="failed to get container status \"5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577\": rpc error: code = NotFound desc = could not find container \"5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577\": container with ID starting with 5ccf82d08bf493ec4966972a540b516e7a736544767cec72cc7d925dfd61d577 not found: ID does not exist" Apr 17 16:27:33.803050 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.803023 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mqnpf"] Apr 17 16:27:33.806171 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:33.806151 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mqnpf"] Apr 17 16:27:34.216748 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:34.216713 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af49f1ec-ec08-43e7-b6a7-73f92591d664" path="/var/lib/kubelet/pods/af49f1ec-ec08-43e7-b6a7-73f92591d664/volumes" Apr 17 16:27:40.337504 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:40.337465 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:40.338040 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:40.337516 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:40.360559 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:40.360530 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:40.829788 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:40.829761 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-x66nk" Apr 17 16:27:44.254707 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.254675 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f47b6d896-zckpn"] Apr 17 16:27:44.255210 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.255188 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af49f1ec-ec08-43e7-b6a7-73f92591d664" containerName="registry-server" Apr 17 16:27:44.255283 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.255214 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="af49f1ec-ec08-43e7-b6a7-73f92591d664" containerName="registry-server" Apr 17 16:27:44.255353 ip-10-0-136-214 
kubenswrapper[2569]: I0417 16:27:44.255340 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="af49f1ec-ec08-43e7-b6a7-73f92591d664" containerName="registry-server" Apr 17 16:27:44.262325 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.262307 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.266550 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.266525 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f47b6d896-zckpn"] Apr 17 16:27:44.392064 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392032 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-oauth-serving-cert\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.392255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392077 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-oauth-config\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.392255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-config\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.392255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392151 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-serving-cert\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.392255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392191 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-trusted-ca-bundle\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.392255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392218 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-service-ca\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.392430 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.392316 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqvg7\" (UniqueName: \"kubernetes.io/projected/0b95e642-d03d-4713-923a-ac3c5b1ff4db-kube-api-access-bqvg7\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " 
pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493534 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-oauth-serving-cert\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493695 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-oauth-config\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493695 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493665 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-config\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493776 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-serving-cert\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493776 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-trusted-ca-bundle\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493878 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493773 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-service-ca\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.493878 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.493841 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqvg7\" (UniqueName: \"kubernetes.io/projected/0b95e642-d03d-4713-923a-ac3c5b1ff4db-kube-api-access-bqvg7\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.494310 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.494282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-oauth-serving-cert\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.494508 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.494482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-config\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.494630 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.494482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-service-ca\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.494703 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.494680 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b95e642-d03d-4713-923a-ac3c5b1ff4db-trusted-ca-bundle\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.496060 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.496042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-oauth-config\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.496217 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.496198 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0b95e642-d03d-4713-923a-ac3c5b1ff4db-console-serving-cert\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.502359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.502336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqvg7\" (UniqueName: \"kubernetes.io/projected/0b95e642-d03d-4713-923a-ac3c5b1ff4db-kube-api-access-bqvg7\") pod \"console-5f47b6d896-zckpn\" (UID: \"0b95e642-d03d-4713-923a-ac3c5b1ff4db\") " pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.573196 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.573166 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:44.696934 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.696909 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f47b6d896-zckpn"] Apr 17 16:27:44.698209 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:44.698176 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b95e642_d03d_4713_923a_ac3c5b1ff4db.slice/crio-e90972282da0cb3dd81d83a76fbac58b4613860c843cb9e4c198163c3dee234d WatchSource:0}: Error finding container e90972282da0cb3dd81d83a76fbac58b4613860c843cb9e4c198163c3dee234d: Status 404 returned error can't find the container with id e90972282da0cb3dd81d83a76fbac58b4613860c843cb9e4c198163c3dee234d Apr 17 16:27:44.826099 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.826007 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f47b6d896-zckpn" event={"ID":"0b95e642-d03d-4713-923a-ac3c5b1ff4db","Type":"ContainerStarted","Data":"133a50ba5d4539fd16b1cc512b00c85e1493616b60a58c6b90d2aa77b2ea09f7"} Apr 17 16:27:44.826280 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.826156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f47b6d896-zckpn" event={"ID":"0b95e642-d03d-4713-923a-ac3c5b1ff4db","Type":"ContainerStarted","Data":"e90972282da0cb3dd81d83a76fbac58b4613860c843cb9e4c198163c3dee234d"} Apr 17 16:27:44.843835 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:44.843774 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f47b6d896-zckpn" podStartSLOduration=0.843756213 podStartE2EDuration="843.756213ms" podCreationTimestamp="2026-04-17 16:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:27:44.842870878 +0000 UTC m=+451.223499251" watchObservedRunningTime="2026-04-17 16:27:44.843756213 +0000 UTC m=+451.224384575" Apr 17 16:27:45.051697 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.051662 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562"] Apr 17 16:27:45.055154 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.055135 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.057462 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.057439 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-vpm4x\"" Apr 17 16:27:45.064082 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.064054 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562"] Apr 17 16:27:45.201085 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.200987 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.201085 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.201036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.201085 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.201061 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqv7\" (UniqueName: \"kubernetes.io/projected/2ee32c6e-ade8-495c-96a8-b71e6126eaec-kube-api-access-bpqv7\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.302120 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.302073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.302588 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.302132 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.302588 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.302168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqv7\" (UniqueName: \"kubernetes.io/projected/2ee32c6e-ade8-495c-96a8-b71e6126eaec-kube-api-access-bpqv7\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.302588 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.302558 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.302588 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.302570 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.311665 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.311632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqv7\" (UniqueName: \"kubernetes.io/projected/2ee32c6e-ade8-495c-96a8-b71e6126eaec-kube-api-access-bpqv7\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.365411 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.365381 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:45.489076 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.489042 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562"] Apr 17 16:27:45.490671 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:45.490645 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee32c6e_ade8_495c_96a8_b71e6126eaec.slice/crio-db7cc03a149ca657c23592728fe09a03c86bdd39df65a02a59784bfdd9f88c0e WatchSource:0}: Error finding container db7cc03a149ca657c23592728fe09a03c86bdd39df65a02a59784bfdd9f88c0e: Status 404 returned error can't find the container with id db7cc03a149ca657c23592728fe09a03c86bdd39df65a02a59784bfdd9f88c0e Apr 17 16:27:45.651008 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.650977 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5"] Apr 17 16:27:45.654442 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.654425 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.661583 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.661556 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5"] Apr 17 16:27:45.805505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.805410 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.805505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.805460 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.805505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.805483 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7lv\" (UniqueName: \"kubernetes.io/projected/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-kube-api-access-fw7lv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.831039 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.831007 2569 generic.go:358] "Generic (PLEG): container finished" podID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerID="7f6d27d07346d11fd0f34a9456eb553cb3273cafbbb53f49ddb4c348e2f44121" exitCode=0 Apr 17 16:27:45.831205 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.831092 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" event={"ID":"2ee32c6e-ade8-495c-96a8-b71e6126eaec","Type":"ContainerDied","Data":"7f6d27d07346d11fd0f34a9456eb553cb3273cafbbb53f49ddb4c348e2f44121"} Apr 17 16:27:45.831205 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.831122 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" event={"ID":"2ee32c6e-ade8-495c-96a8-b71e6126eaec","Type":"ContainerStarted","Data":"db7cc03a149ca657c23592728fe09a03c86bdd39df65a02a59784bfdd9f88c0e"} Apr 17 16:27:45.906646 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.906606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.906860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.906673 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.906860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.906694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7lv\" (UniqueName: \"kubernetes.io/projected/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-kube-api-access-fw7lv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.907184 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.907142 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.907184 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.907150 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.914900 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.914875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7lv\" (UniqueName: \"kubernetes.io/projected/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-kube-api-access-fw7lv\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:45.965618 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:45.965578 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:46.086796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.086770 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5"] Apr 17 16:27:46.088549 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:46.088524 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod742cf387_cfb5_4d5a_93ef_8a54c1b98ec7.slice/crio-4c4e1ea6a9b29a74135c2cc0d12c6d30895e2317d432fa3af4188b21aa72d4e7 WatchSource:0}: Error finding container 4c4e1ea6a9b29a74135c2cc0d12c6d30895e2317d432fa3af4188b21aa72d4e7: Status 404 returned error can't find the container with id 4c4e1ea6a9b29a74135c2cc0d12c6d30895e2317d432fa3af4188b21aa72d4e7 Apr 17 16:27:46.248277 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.248243 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn"] Apr 17 16:27:46.251742 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.251725 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.259324 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.259302 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn"] Apr 17 16:27:46.411681 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.411589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.411681 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.411673 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.412169 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.411782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6sbc\" (UniqueName: \"kubernetes.io/projected/15631fe2-e551-42b9-a8df-6a77f70d7753-kube-api-access-h6sbc\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.512935 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.512894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 
16:27:46.513069 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.512980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6sbc\" (UniqueName: \"kubernetes.io/projected/15631fe2-e551-42b9-a8df-6a77f70d7753-kube-api-access-h6sbc\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.513069 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.513025 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.513285 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.513264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.513361 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.513343 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.521723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.521696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6sbc\" (UniqueName: \"kubernetes.io/projected/15631fe2-e551-42b9-a8df-6a77f70d7753-kube-api-access-h6sbc\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.561880 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.561846 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:46.654793 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.654761 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x"] Apr 17 16:27:46.659006 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.658985 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x"
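The `manager.go:1169` warnings in this stretch ("Failed to process watch event ... Status 404 ... can't find the container with id ...") appear to be the usual race where cAdvisor notices a new `crio-<id>` cgroup before the runtime has finished registering the container; in this log every 404'd ID reappears moments later in a `ContainerStarted` PLEG event, which is the benign ordering. A quick cross-check sketch; `kubelet.log` is a hypothetical file holding this journal dump:

```python
import re

watch_404 = set()   # container IDs named in "Failed to process watch event ... 404" warnings
started = set()     # container IDs seen in ContainerStarted PLEG events

with open("kubelet.log") as f:   # hypothetical path, assumed for this sketch
    for line in f:
        if m := re.search(r"can't find the container with id ([0-9a-f]{64})", line):
            watch_404.add(m.group(1))
        if m := re.search(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"', line):
            started.add(m.group(1))

# An ID that got a 404 but never a ContainerStarted would be worth a closer look.
print("unmatched watch 404s:", sorted(watch_404 - started))
```

For this section the set difference comes out empty, which is what the benign race looks like.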
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.667834 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.667792 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x"] Apr 17 16:27:46.699018 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.698990 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn"] Apr 17 16:27:46.728265 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:46.728209 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15631fe2_e551_42b9_a8df_6a77f70d7753.slice/crio-3dc408d7415a1cb3cbcc7fbf27e65073e8fc505f20575023455f7fad391ad42b WatchSource:0}: Error finding container 3dc408d7415a1cb3cbcc7fbf27e65073e8fc505f20575023455f7fad391ad42b: Status 404 returned error can't find the container with id 3dc408d7415a1cb3cbcc7fbf27e65073e8fc505f20575023455f7fad391ad42b Apr 17 16:27:46.816344 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.816306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.816505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.816420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.816505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.816461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmcnj\" (UniqueName: \"kubernetes.io/projected/5d31e936-9d34-43bf-acdb-2912119ed690-kube-api-access-tmcnj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.840040 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.840008 2569 generic.go:358] "Generic (PLEG): container finished" podID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerID="7c5a9bae197bf5e21b2b9847da9c1515d77526b357a084caee915059d6325f56" exitCode=0 Apr 17 16:27:46.840197 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.840084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" event={"ID":"2ee32c6e-ade8-495c-96a8-b71e6126eaec","Type":"ContainerDied","Data":"7c5a9bae197bf5e21b2b9847da9c1515d77526b357a084caee915059d6325f56"} Apr 17 16:27:46.841807 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.841779 2569 generic.go:358] "Generic (PLEG): container finished" podID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerID="a6028cca32aea582069684a7b791ddf796d73dc97eb360bf75cf25047cf1892c" exitCode=0 Apr 17 16:27:46.841928 
ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.841820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" event={"ID":"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7","Type":"ContainerDied","Data":"a6028cca32aea582069684a7b791ddf796d73dc97eb360bf75cf25047cf1892c"} Apr 17 16:27:46.841928 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.841852 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" event={"ID":"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7","Type":"ContainerStarted","Data":"4c4e1ea6a9b29a74135c2cc0d12c6d30895e2317d432fa3af4188b21aa72d4e7"} Apr 17 16:27:46.843389 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.843364 2569 generic.go:358] "Generic (PLEG): container finished" podID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerID="805387cbc9a3e1a16682cc28ca97b0acaa2ad873225f8a0484a3c5688473b931" exitCode=0 Apr 17 16:27:46.843473 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.843405 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" event={"ID":"15631fe2-e551-42b9-a8df-6a77f70d7753","Type":"ContainerDied","Data":"805387cbc9a3e1a16682cc28ca97b0acaa2ad873225f8a0484a3c5688473b931"} Apr 17 16:27:46.843473 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.843425 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" event={"ID":"15631fe2-e551-42b9-a8df-6a77f70d7753","Type":"ContainerStarted","Data":"3dc408d7415a1cb3cbcc7fbf27e65073e8fc505f20575023455f7fad391ad42b"} Apr 17 16:27:46.918149 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.918060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.918149 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.918115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmcnj\" (UniqueName: \"kubernetes.io/projected/5d31e936-9d34-43bf-acdb-2912119ed690-kube-api-access-tmcnj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.918447 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.918303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.918507 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.918492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.918694 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.918674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:46.930376 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:46.930341 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmcnj\" (UniqueName: \"kubernetes.io/projected/5d31e936-9d34-43bf-acdb-2912119ed690-kube-api-access-tmcnj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:47.030711 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.030676 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:47.152628 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.152605 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x"] Apr 17 16:27:47.153974 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:27:47.153945 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d31e936_9d34_43bf_acdb_2912119ed690.slice/crio-cc21282e689578bb62e80c3e14cb66db706b4e4979aca03235952c78c79d9278 WatchSource:0}: Error finding container cc21282e689578bb62e80c3e14cb66db706b4e4979aca03235952c78c79d9278: Status 404 returned error can't find the container with id cc21282e689578bb62e80c3e14cb66db706b4e4979aca03235952c78c79d9278 Apr 17 16:27:47.850157 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.850123 2569 generic.go:358] "Generic (PLEG): container finished" podID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerID="985500c79a7dcd7cd9b5153fd848054c981b093532b690d51afca332e5567b0b" exitCode=0 Apr 17 16:27:47.850593 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.850211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" event={"ID":"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7","Type":"ContainerDied","Data":"985500c79a7dcd7cd9b5153fd848054c981b093532b690d51afca332e5567b0b"} Apr 17 16:27:47.852087 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.852071 2569 generic.go:358] "Generic (PLEG): container finished" podID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerID="bc7f97778e98ee59b2d458f41f669ed427e5633b6bb240735261d8d289b5f465" exitCode=0 Apr 17 16:27:47.852171 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.852141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" event={"ID":"15631fe2-e551-42b9-a8df-6a77f70d7753","Type":"ContainerDied","Data":"bc7f97778e98ee59b2d458f41f669ed427e5633b6bb240735261d8d289b5f465"} Apr 17 16:27:47.853742 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.853718 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d31e936-9d34-43bf-acdb-2912119ed690" 
containerID="dd5e76e2ca1fc0e9595444fbebf68a9be88b3b433e5c42f1218efedb2a4e5042" exitCode=0 Apr 17 16:27:47.853865 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.853800 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" event={"ID":"5d31e936-9d34-43bf-acdb-2912119ed690","Type":"ContainerDied","Data":"dd5e76e2ca1fc0e9595444fbebf68a9be88b3b433e5c42f1218efedb2a4e5042"} Apr 17 16:27:47.853865 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.853828 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" event={"ID":"5d31e936-9d34-43bf-acdb-2912119ed690","Type":"ContainerStarted","Data":"cc21282e689578bb62e80c3e14cb66db706b4e4979aca03235952c78c79d9278"} Apr 17 16:27:47.856591 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.856571 2569 generic.go:358] "Generic (PLEG): container finished" podID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerID="fcf8d0b45f4a2fc35ca3c03087cc3022aca70786d83141f6ec9e29546e56ac4a" exitCode=0 Apr 17 16:27:47.856679 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:47.856611 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" event={"ID":"2ee32c6e-ade8-495c-96a8-b71e6126eaec","Type":"ContainerDied","Data":"fcf8d0b45f4a2fc35ca3c03087cc3022aca70786d83141f6ec9e29546e56ac4a"} Apr 17 16:27:48.862126 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.862087 2569 generic.go:358] "Generic (PLEG): container finished" podID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerID="7cedffb4b3b40513e9d4c456c1c0bd9660f88bffcdacfd85606b9053430458f0" exitCode=0 Apr 17 16:27:48.862563 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.862169 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" event={"ID":"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7","Type":"ContainerDied","Data":"7cedffb4b3b40513e9d4c456c1c0bd9660f88bffcdacfd85606b9053430458f0"} Apr 17 16:27:48.863859 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.863837 2569 generic.go:358] "Generic (PLEG): container finished" podID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerID="0089789ee1eba6a5505690a77ea0aee7a1904a8a8bc2724b9d1b283e08e6049d" exitCode=0 Apr 17 16:27:48.863998 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.863907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" event={"ID":"15631fe2-e551-42b9-a8df-6a77f70d7753","Type":"ContainerDied","Data":"0089789ee1eba6a5505690a77ea0aee7a1904a8a8bc2724b9d1b283e08e6049d"} Apr 17 16:27:48.865321 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.865302 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d31e936-9d34-43bf-acdb-2912119ed690" containerID="553529060f2b5612b90abed56cfa1230b3d0484b44bd9cacd2108de131aed667" exitCode=0 Apr 17 16:27:48.865452 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.865431 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" event={"ID":"5d31e936-9d34-43bf-acdb-2912119ed690","Type":"ContainerDied","Data":"553529060f2b5612b90abed56cfa1230b3d0484b44bd9cacd2108de131aed667"} Apr 17 16:27:48.989359 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:48.989333 2569 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:49.139506 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.139418 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-util\") pod \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " Apr 17 16:27:49.139645 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.139522 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-bundle\") pod \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " Apr 17 16:27:49.139645 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.139555 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqv7\" (UniqueName: \"kubernetes.io/projected/2ee32c6e-ade8-495c-96a8-b71e6126eaec-kube-api-access-bpqv7\") pod \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\" (UID: \"2ee32c6e-ade8-495c-96a8-b71e6126eaec\") " Apr 17 16:27:49.139994 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.139960 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-bundle" (OuterVolumeSpecName: "bundle") pod "2ee32c6e-ade8-495c-96a8-b71e6126eaec" (UID: "2ee32c6e-ade8-495c-96a8-b71e6126eaec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:49.141766 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.141742 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee32c6e-ade8-495c-96a8-b71e6126eaec-kube-api-access-bpqv7" (OuterVolumeSpecName: "kube-api-access-bpqv7") pod "2ee32c6e-ade8-495c-96a8-b71e6126eaec" (UID: "2ee32c6e-ade8-495c-96a8-b71e6126eaec"). InnerVolumeSpecName "kube-api-access-bpqv7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:27:49.144734 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.144708 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-util" (OuterVolumeSpecName: "util") pod "2ee32c6e-ade8-495c-96a8-b71e6126eaec" (UID: "2ee32c6e-ade8-495c-96a8-b71e6126eaec"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:49.240388 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.240347 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:49.240388 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.240385 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpqv7\" (UniqueName: \"kubernetes.io/projected/2ee32c6e-ade8-495c-96a8-b71e6126eaec-kube-api-access-bpqv7\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:49.240388 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.240397 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee32c6e-ade8-495c-96a8-b71e6126eaec-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:49.870531 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.870494 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d31e936-9d34-43bf-acdb-2912119ed690" containerID="286f6b53f353f67ad30bf78fc0f0c61ecbee994b8edb52c6f62a5dcaf3b96a54" exitCode=0 Apr 17 16:27:49.870995 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.870571 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" event={"ID":"5d31e936-9d34-43bf-acdb-2912119ed690","Type":"ContainerDied","Data":"286f6b53f353f67ad30bf78fc0f0c61ecbee994b8edb52c6f62a5dcaf3b96a54"} Apr 17 16:27:49.872183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.872161 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" event={"ID":"2ee32c6e-ade8-495c-96a8-b71e6126eaec","Type":"ContainerDied","Data":"db7cc03a149ca657c23592728fe09a03c86bdd39df65a02a59784bfdd9f88c0e"} Apr 17 16:27:49.872292 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.872193 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db7cc03a149ca657c23592728fe09a03c86bdd39df65a02a59784bfdd9f88c0e" Apr 17 16:27:49.872292 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:49.872223 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562" Apr 17 16:27:50.025445 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.025421 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:50.029135 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.029116 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:50.147935 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.147844 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-bundle\") pod \"15631fe2-e551-42b9-a8df-6a77f70d7753\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " Apr 17 16:27:50.147935 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.147903 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6sbc\" (UniqueName: \"kubernetes.io/projected/15631fe2-e551-42b9-a8df-6a77f70d7753-kube-api-access-h6sbc\") pod \"15631fe2-e551-42b9-a8df-6a77f70d7753\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " Apr 17 16:27:50.147935 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.147929 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw7lv\" (UniqueName: \"kubernetes.io/projected/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-kube-api-access-fw7lv\") pod \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " Apr 17 16:27:50.148223 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.147956 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-util\") pod \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " Apr 17 16:27:50.148223 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.147972 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-util\") pod \"15631fe2-e551-42b9-a8df-6a77f70d7753\" (UID: \"15631fe2-e551-42b9-a8df-6a77f70d7753\") " Apr 17 16:27:50.148223 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.148008 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-bundle\") pod \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\" (UID: \"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7\") " Apr 17 16:27:50.148542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.148508 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-bundle" (OuterVolumeSpecName: "bundle") pod "15631fe2-e551-42b9-a8df-6a77f70d7753" (UID: "15631fe2-e551-42b9-a8df-6a77f70d7753"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:50.148846 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.148815 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-bundle" (OuterVolumeSpecName: "bundle") pod "742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" (UID: "742cf387-cfb5-4d5a-93ef-8a54c1b98ec7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:50.150193 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.150164 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15631fe2-e551-42b9-a8df-6a77f70d7753-kube-api-access-h6sbc" (OuterVolumeSpecName: "kube-api-access-h6sbc") pod "15631fe2-e551-42b9-a8df-6a77f70d7753" (UID: "15631fe2-e551-42b9-a8df-6a77f70d7753"). InnerVolumeSpecName "kube-api-access-h6sbc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:27:50.150492 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.150475 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-kube-api-access-fw7lv" (OuterVolumeSpecName: "kube-api-access-fw7lv") pod "742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" (UID: "742cf387-cfb5-4d5a-93ef-8a54c1b98ec7"). InnerVolumeSpecName "kube-api-access-fw7lv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:27:50.153554 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.153532 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-util" (OuterVolumeSpecName: "util") pod "742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" (UID: "742cf387-cfb5-4d5a-93ef-8a54c1b98ec7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:50.153928 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.153908 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-util" (OuterVolumeSpecName: "util") pod "15631fe2-e551-42b9-a8df-6a77f70d7753" (UID: "15631fe2-e551-42b9-a8df-6a77f70d7753"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:50.249124 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.249102 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:50.249124 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.249125 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6sbc\" (UniqueName: \"kubernetes.io/projected/15631fe2-e551-42b9-a8df-6a77f70d7753-kube-api-access-h6sbc\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:50.249267 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.249136 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fw7lv\" (UniqueName: \"kubernetes.io/projected/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-kube-api-access-fw7lv\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:50.249267 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.249145 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:50.249267 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.249154 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/15631fe2-e551-42b9-a8df-6a77f70d7753-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:50.249267 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.249162 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/742cf387-cfb5-4d5a-93ef-8a54c1b98ec7-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:50.877689 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.877657 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" Apr 17 16:27:50.877689 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.877662 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5" event={"ID":"742cf387-cfb5-4d5a-93ef-8a54c1b98ec7","Type":"ContainerDied","Data":"4c4e1ea6a9b29a74135c2cc0d12c6d30895e2317d432fa3af4188b21aa72d4e7"} Apr 17 16:27:50.877689 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.877695 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c4e1ea6a9b29a74135c2cc0d12c6d30895e2317d432fa3af4188b21aa72d4e7" Apr 17 16:27:50.879395 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.879370 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" event={"ID":"15631fe2-e551-42b9-a8df-6a77f70d7753","Type":"ContainerDied","Data":"3dc408d7415a1cb3cbcc7fbf27e65073e8fc505f20575023455f7fad391ad42b"} Apr 17 16:27:50.879395 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.879391 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn" Apr 17 16:27:50.879395 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:50.879398 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dc408d7415a1cb3cbcc7fbf27e65073e8fc505f20575023455f7fad391ad42b" Apr 17 16:27:51.005501 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.005478 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:51.156888 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.156793 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-bundle\") pod \"5d31e936-9d34-43bf-acdb-2912119ed690\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " Apr 17 16:27:51.157064 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.156892 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmcnj\" (UniqueName: \"kubernetes.io/projected/5d31e936-9d34-43bf-acdb-2912119ed690-kube-api-access-tmcnj\") pod \"5d31e936-9d34-43bf-acdb-2912119ed690\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " Apr 17 16:27:51.157064 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.156939 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-util\") pod \"5d31e936-9d34-43bf-acdb-2912119ed690\" (UID: \"5d31e936-9d34-43bf-acdb-2912119ed690\") " Apr 17 16:27:51.157476 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.157444 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-bundle" (OuterVolumeSpecName: "bundle") pod "5d31e936-9d34-43bf-acdb-2912119ed690" (UID: "5d31e936-9d34-43bf-acdb-2912119ed690"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:51.159000 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.158976 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d31e936-9d34-43bf-acdb-2912119ed690-kube-api-access-tmcnj" (OuterVolumeSpecName: "kube-api-access-tmcnj") pod "5d31e936-9d34-43bf-acdb-2912119ed690" (UID: "5d31e936-9d34-43bf-acdb-2912119ed690"). InnerVolumeSpecName "kube-api-access-tmcnj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:27:51.162200 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.162169 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-util" (OuterVolumeSpecName: "util") pod "5d31e936-9d34-43bf-acdb-2912119ed690" (UID: "5d31e936-9d34-43bf-acdb-2912119ed690"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:27:51.258026 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.257990 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-util\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:51.258026 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.258017 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d31e936-9d34-43bf-acdb-2912119ed690-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:51.258026 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.258028 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmcnj\" (UniqueName: \"kubernetes.io/projected/5d31e936-9d34-43bf-acdb-2912119ed690-kube-api-access-tmcnj\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:27:51.884606 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.884575 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" Apr 17 16:27:51.884606 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.884587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x" event={"ID":"5d31e936-9d34-43bf-acdb-2912119ed690","Type":"ContainerDied","Data":"cc21282e689578bb62e80c3e14cb66db706b4e4979aca03235952c78c79d9278"} Apr 17 16:27:51.885023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:51.884623 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc21282e689578bb62e80c3e14cb66db706b4e4979aca03235952c78c79d9278" Apr 17 16:27:54.573965 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:54.573925 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:54.574310 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:54.573983 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:54.578810 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:54.578786 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:54.904083 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:54.904009 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f47b6d896-zckpn" Apr 17 16:27:54.950933 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:27:54.950900 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f87b67c4b-29cns"] Apr 17 16:28:02.980205 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980160 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"] Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980543 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="pull" Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980554 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="pull" Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 
16:28:02.980563 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="pull"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980571 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="pull"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980579 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="util"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980584 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="util"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980594 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="util"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980599 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="util"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980608 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="extract"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980615 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="extract"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980626 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="extract"
Apr 17 16:28:02.980632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980634 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980643 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980648 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980658 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="util"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980663 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="util"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980669 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="pull"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980673 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="pull"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980679 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="pull"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980684 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="pull"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980693 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="util"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980698 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="util"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980706 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980711 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980766 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="742cf387-cfb5-4d5a-93ef-8a54c1b98ec7" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980775 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d31e936-9d34-43bf-acdb-2912119ed690" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980784 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ee32c6e-ade8-495c-96a8-b71e6126eaec" containerName="extract"
Apr 17 16:28:02.981023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.980790 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="15631fe2-e551-42b9-a8df-6a77f70d7753" containerName="extract"
Apr 17 16:28:02.983145 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.983128 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:02.986178 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.986159 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-k7xgk\""
Apr 17 16:28:02.998343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:02.998317 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"]
Apr 17 16:28:03.059981 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.059952 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6xgb\" (UniqueName: \"kubernetes.io/projected/750be774-3215-4390-bf85-b605cabecea2-kube-api-access-q6xgb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.060203 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.060071 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/750be774-3215-4390-bf85-b605cabecea2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.161381 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.161339 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/750be774-3215-4390-bf85-b605cabecea2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.161536 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.161410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6xgb\" (UniqueName: \"kubernetes.io/projected/750be774-3215-4390-bf85-b605cabecea2-kube-api-access-q6xgb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.161744 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.161718 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/750be774-3215-4390-bf85-b605cabecea2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.170648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.170622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6xgb\" (UniqueName: \"kubernetes.io/projected/750be774-3215-4390-bf85-b605cabecea2-kube-api-access-q6xgb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.293631 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.293544 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:03.422595 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.422568 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"]
Apr 17 16:28:03.424522 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:03.424490 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod750be774_3215_4390_bf85_b605cabecea2.slice/crio-5839d69afab83ed8caef99e5b2802efaa95486863d7d86859902850e95ba935c WatchSource:0}: Error finding container 5839d69afab83ed8caef99e5b2802efaa95486863d7d86859902850e95ba935c: Status 404 returned error can't find the container with id 5839d69afab83ed8caef99e5b2802efaa95486863d7d86859902850e95ba935c
Apr 17 16:28:03.934455 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:03.934411 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" event={"ID":"750be774-3215-4390-bf85-b605cabecea2","Type":"ContainerStarted","Data":"5839d69afab83ed8caef99e5b2802efaa95486863d7d86859902850e95ba935c"}
Apr 17 16:28:08.955498 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:08.955398 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" event={"ID":"750be774-3215-4390-bf85-b605cabecea2","Type":"ContainerStarted","Data":"7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21"}
Apr 17 16:28:08.955498 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:08.955443 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:08.985244 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:08.985173 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" podStartSLOduration=1.739666841 podStartE2EDuration="6.985158012s" podCreationTimestamp="2026-04-17 16:28:02 +0000 UTC" firstStartedPulling="2026-04-17 16:28:03.426929131 +0000 UTC m=+469.807557472" lastFinishedPulling="2026-04-17 16:28:08.672420304 +0000 UTC m=+475.053048643" observedRunningTime="2026-04-17 16:28:08.982533216 +0000 UTC m=+475.363161577" watchObservedRunningTime="2026-04-17 16:28:08.985158012 +0000 UTC m=+475.365786373"
Apr 17 16:28:10.622096 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.622061 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"]
Apr 17 16:28:10.625653 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.625635 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:10.628031 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.627997 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-q2z79\""
Apr 17 16:28:10.636305 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.636281 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"]
Apr 17 16:28:10.730814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.730782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljckp\" (UniqueName: \"kubernetes.io/projected/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f-kube-api-access-ljckp\") pod \"limitador-operator-controller-manager-85c4996f8c-sft6v\" (UID: \"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:10.831580 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.831548 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljckp\" (UniqueName: \"kubernetes.io/projected/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f-kube-api-access-ljckp\") pod \"limitador-operator-controller-manager-85c4996f8c-sft6v\" (UID: \"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:10.842587 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.842561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljckp\" (UniqueName: \"kubernetes.io/projected/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f-kube-api-access-ljckp\") pod \"limitador-operator-controller-manager-85c4996f8c-sft6v\" (UID: \"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:10.936918 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:10.936832 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:11.074469 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:11.074446 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"]
Apr 17 16:28:11.076379 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:11.076351 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee15d93_19f6_4c69_9af7_60f6cfe1dd4f.slice/crio-fbcf5c6375f98aca2c940ddb938371a6ab819be2f9a1f582bd3b148978c08358 WatchSource:0}: Error finding container fbcf5c6375f98aca2c940ddb938371a6ab819be2f9a1f582bd3b148978c08358: Status 404 returned error can't find the container with id fbcf5c6375f98aca2c940ddb938371a6ab819be2f9a1f582bd3b148978c08358
Apr 17 16:28:11.969023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:11.968974 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" event={"ID":"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f","Type":"ContainerStarted","Data":"fbcf5c6375f98aca2c940ddb938371a6ab819be2f9a1f582bd3b148978c08358"}
Apr 17 16:28:13.978246 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:13.978195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" event={"ID":"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f","Type":"ContainerStarted","Data":"b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4"}
Apr 17 16:28:13.978637 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:13.978336 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:13.999192 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:13.999142 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" podStartSLOduration=2.135097199 podStartE2EDuration="3.999131397s" podCreationTimestamp="2026-04-17 16:28:10 +0000 UTC" firstStartedPulling="2026-04-17 16:28:11.078216369 +0000 UTC m=+477.458844709" lastFinishedPulling="2026-04-17 16:28:12.942250553 +0000 UTC m=+479.322878907" observedRunningTime="2026-04-17 16:28:13.997464622 +0000 UTC m=+480.378092986" watchObservedRunningTime="2026-04-17 16:28:13.999131397 +0000 UTC m=+480.379759757"
Apr 17 16:28:19.961360 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:19.961323 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:19.974797 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:19.974764 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f87b67c4b-29cns" podUID="2cb33870-85ad-4318-b170-0110affd63f8" containerName="console" containerID="cri-o://1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa" gracePeriod=15
Apr 17 16:28:20.210936 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.210909 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f87b67c4b-29cns_2cb33870-85ad-4318-b170-0110affd63f8/console/0.log"
Apr 17 16:28:20.211060 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.210970 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f87b67c4b-29cns"
Apr 17 16:28:20.320895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.320864 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-oauth-serving-cert\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.320895 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.320899 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-serving-cert\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.321151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.320921 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-service-ca\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.321151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.320939 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-oauth-config\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.321151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.320957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhz4v\" (UniqueName: \"kubernetes.io/projected/2cb33870-85ad-4318-b170-0110affd63f8-kube-api-access-fhz4v\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.321151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.321054 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-trusted-ca-bundle\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.321151 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.321085 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-console-config\") pod \"2cb33870-85ad-4318-b170-0110affd63f8\" (UID: \"2cb33870-85ad-4318-b170-0110affd63f8\") "
Apr 17 16:28:20.321417 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.321308 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:28:20.321474 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.321437 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-oauth-serving-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.321528 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.321483 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-service-ca" (OuterVolumeSpecName: "service-ca") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:28:20.321584 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.321511 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:28:20.322047 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.322020 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-console-config" (OuterVolumeSpecName: "console-config") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:28:20.323324 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.323298 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:28:20.323428 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.323393 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:28:20.323548 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.323519 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb33870-85ad-4318-b170-0110affd63f8-kube-api-access-fhz4v" (OuterVolumeSpecName: "kube-api-access-fhz4v") pod "2cb33870-85ad-4318-b170-0110affd63f8" (UID: "2cb33870-85ad-4318-b170-0110affd63f8"). InnerVolumeSpecName "kube-api-access-fhz4v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:28:20.422423 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.422383 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-trusted-ca-bundle\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.422423 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.422421 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-console-config\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.422423 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.422433 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-serving-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.422657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.422443 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cb33870-85ad-4318-b170-0110affd63f8-service-ca\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.422657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.422452 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cb33870-85ad-4318-b170-0110affd63f8-console-oauth-config\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.422657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.422463 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhz4v\" (UniqueName: \"kubernetes.io/projected/2cb33870-85ad-4318-b170-0110affd63f8-kube-api-access-fhz4v\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:20.996442 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.996404 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"]
Apr 17 16:28:20.996835 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:20.996613 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" podUID="750be774-3215-4390-bf85-b605cabecea2" containerName="manager" containerID="cri-o://7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21" gracePeriod=2
Apr 17 16:28:21.007697 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.007667 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"]
Apr 17 16:28:21.010386 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.010348 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f87b67c4b-29cns_2cb33870-85ad-4318-b170-0110affd63f8/console/0.log"
Apr 17 16:28:21.010505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.010403 2569 generic.go:358] "Generic (PLEG): container finished" podID="2cb33870-85ad-4318-b170-0110affd63f8" containerID="1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa" exitCode=2
Apr 17 16:28:21.010505 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.010499 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f87b67c4b-29cns"
Apr 17 16:28:21.010635 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.010548 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87b67c4b-29cns" event={"ID":"2cb33870-85ad-4318-b170-0110affd63f8","Type":"ContainerDied","Data":"1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa"}
Apr 17 16:28:21.010635 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.010585 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f87b67c4b-29cns" event={"ID":"2cb33870-85ad-4318-b170-0110affd63f8","Type":"ContainerDied","Data":"36318d0f607eac48156bf8c2a43d08cf4cbeb4ae759a084f73748064296b858f"}
Apr 17 16:28:21.010635 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.010605 2569 scope.go:117] "RemoveContainer" containerID="1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa"
Apr 17 16:28:21.019393 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019366 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"]
Apr 17 16:28:21.019788 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019768 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="750be774-3215-4390-bf85-b605cabecea2" containerName="manager"
Apr 17 16:28:21.019788 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019783 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="750be774-3215-4390-bf85-b605cabecea2" containerName="manager"
Apr 17 16:28:21.019875 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019799 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cb33870-85ad-4318-b170-0110affd63f8" containerName="console"
Apr 17 16:28:21.019875 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019805 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb33870-85ad-4318-b170-0110affd63f8" containerName="console"
Apr 17 16:28:21.019941 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019877 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cb33870-85ad-4318-b170-0110affd63f8" containerName="console"
Apr 17 16:28:21.019941 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.019888 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="750be774-3215-4390-bf85-b605cabecea2" containerName="manager"
Apr 17 16:28:21.023048 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.023033 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.023569 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.023542 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.026844 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.026811 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.028632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.028611 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"]
Apr 17 16:28:21.028834 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.028814 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" containerName="manager" containerID="cri-o://b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4" gracePeriod=2
Apr 17 16:28:21.030741 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.030721 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:21.039875 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.039813 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"]
Apr 17 16:28:21.046256 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.046215 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"]
Apr 17 16:28:21.060434 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.060402 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.062512 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.062482 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.113513 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.113489 2569 scope.go:117] "RemoveContainer" containerID="1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa"
Apr 17 16:28:21.113915 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:28:21.113891 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa\": container with ID starting with 1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa not found: ID does not exist" containerID="1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa"
Apr 17 16:28:21.114011 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.113922 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa"} err="failed to get container status \"1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa\": rpc error: code = NotFound desc = could not find container \"1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa\": container with ID starting with 1d23e2617d3e64e4fe32224a286692ee906c5fbb957dfde5c9c1c1483bd0b8aa not found: ID does not exist"
Apr 17 16:28:21.117410 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.117375 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.119456 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.119431 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.129753 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.129723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p8fkb\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.129882 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.129764 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhl77\" (UniqueName: \"kubernetes.io/projected/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-kube-api-access-hhl77\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p8fkb\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.134995 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.134968 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f87b67c4b-29cns"]
Apr 17 16:28:21.140973 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.140943 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f87b67c4b-29cns"]
Apr 17 16:28:21.231031 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.230995 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p8fkb\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.231215 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.231049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhl77\" (UniqueName: \"kubernetes.io/projected/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-kube-api-access-hhl77\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p8fkb\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.231396 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.231371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p8fkb\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.241738 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.241709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhl77\" (UniqueName: \"kubernetes.io/projected/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-kube-api-access-hhl77\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p8fkb\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.271082 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.271059 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:21.273599 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.273564 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.274332 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.274315 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:21.275647 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.275625 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.277672 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.277652 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.279546 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.279529 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.332110 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.332076 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6xgb\" (UniqueName: \"kubernetes.io/projected/750be774-3215-4390-bf85-b605cabecea2-kube-api-access-q6xgb\") pod \"750be774-3215-4390-bf85-b605cabecea2\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") "
Apr 17 16:28:21.332313 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.332158 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljckp\" (UniqueName: \"kubernetes.io/projected/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f-kube-api-access-ljckp\") pod \"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f\" (UID: \"aee15d93-19f6-4c69-9af7-60f6cfe1dd4f\") "
Apr 17 16:28:21.332313 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.332197 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/750be774-3215-4390-bf85-b605cabecea2-extensions-socket-volume\") pod \"750be774-3215-4390-bf85-b605cabecea2\" (UID: \"750be774-3215-4390-bf85-b605cabecea2\") "
Apr 17 16:28:21.332617 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.332593 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750be774-3215-4390-bf85-b605cabecea2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "750be774-3215-4390-bf85-b605cabecea2" (UID: "750be774-3215-4390-bf85-b605cabecea2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:28:21.334243 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.334209 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750be774-3215-4390-bf85-b605cabecea2-kube-api-access-q6xgb" (OuterVolumeSpecName: "kube-api-access-q6xgb") pod "750be774-3215-4390-bf85-b605cabecea2" (UID: "750be774-3215-4390-bf85-b605cabecea2"). InnerVolumeSpecName "kube-api-access-q6xgb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:28:21.334343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.334278 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f-kube-api-access-ljckp" (OuterVolumeSpecName: "kube-api-access-ljckp") pod "aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" (UID: "aee15d93-19f6-4c69-9af7-60f6cfe1dd4f"). InnerVolumeSpecName "kube-api-access-ljckp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:28:21.421373 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.421336 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:21.433548 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.433524 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljckp\" (UniqueName: \"kubernetes.io/projected/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f-kube-api-access-ljckp\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:21.433654 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.433550 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/750be774-3215-4390-bf85-b605cabecea2-extensions-socket-volume\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:21.433654 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.433563 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6xgb\" (UniqueName: \"kubernetes.io/projected/750be774-3215-4390-bf85-b605cabecea2-kube-api-access-q6xgb\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:21.553955 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.553877 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"]
Apr 17 16:28:21.557392 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:21.557360 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c57e9f8_643a_4563_ad48_d4f2e138a3c8.slice/crio-4a65d2ef5fee8bc1f8855007572c71d22f99b6134c5676b140a1de8139d6fe53 WatchSource:0}: Error finding container 4a65d2ef5fee8bc1f8855007572c71d22f99b6134c5676b140a1de8139d6fe53: Status 404 returned error can't find the container with id 4a65d2ef5fee8bc1f8855007572c71d22f99b6134c5676b140a1de8139d6fe53
Apr 17 16:28:21.958997 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.958961 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"]
Apr 17 16:28:21.959376 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.959359 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" containerName="manager"
Apr 17 16:28:21.959421 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.959378 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" containerName="manager"
Apr 17 16:28:21.959466 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.959455 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" containerName="manager"
Apr 17 16:28:21.962752 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.962727 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:21.968516 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.968488 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:21.979350 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:21.979323 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"]
Apr 17 16:28:22.005113 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.005074 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.022849 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.022768 2569 generic.go:358] "Generic (PLEG): container finished" podID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" containerID="b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4" exitCode=0
Apr 17 16:28:22.023018 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.022852 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v"
Apr 17 16:28:22.023018 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.022860 2569 scope.go:117] "RemoveContainer" containerID="b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4"
Apr 17 16:28:22.024542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.024443 2569 generic.go:358] "Generic (PLEG): container finished" podID="750be774-3215-4390-bf85-b605cabecea2" containerID="7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21" exitCode=0
Apr 17 16:28:22.024542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.024513 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6"
Apr 17 16:28:22.025982 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.025952 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.026114 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.026066 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb" event={"ID":"5c57e9f8-643a-4563-ad48-d4f2e138a3c8","Type":"ContainerStarted","Data":"8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0"}
Apr 17 16:28:22.026114 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.026096 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb" event={"ID":"5c57e9f8-643a-4563-ad48-d4f2e138a3c8","Type":"ContainerStarted","Data":"4a65d2ef5fee8bc1f8855007572c71d22f99b6134c5676b140a1de8139d6fe53"}
Apr 17 16:28:22.026258 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.026216 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:22.028367 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.028339 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.031766 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.031736 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.033894 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.033871 2569 scope.go:117] "RemoveContainer" containerID="b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4"
Apr 17 16:28:22.034188 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:28:22.034168 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4\": container with ID starting with b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4 not found: ID does not exist" containerID="b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4"
Apr 17 16:28:22.034301 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.034200 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4"} err="failed to get container status \"b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4\": rpc error: code = NotFound desc = could not find container \"b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4\": container with ID starting with b3390b1b1ceb8a6833bb9456c4164681fcd87727e55503ebcbe0f2da08d6a3f4 not found: ID does not exist"
Apr 17 16:28:22.034301 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.034245 2569 scope.go:117] "RemoveContainer" containerID="7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21"
Apr 17 16:28:22.039396 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.039370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb4a0194-736b-4b69-ab5a-32731bbecb8c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-shv97\" (UID: \"eb4a0194-736b-4b69-ab5a-32731bbecb8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.039485 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.039467 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrqg\" (UniqueName: \"kubernetes.io/projected/eb4a0194-736b-4b69-ab5a-32731bbecb8c-kube-api-access-dlrqg\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-shv97\" (UID: \"eb4a0194-736b-4b69-ab5a-32731bbecb8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.042675 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.042660 2569 scope.go:117] "RemoveContainer" containerID="7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21"
Apr 17 16:28:22.042940 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:28:22.042921 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21\": container with ID starting with 7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21 not found: ID does not exist" containerID="7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21"
Apr 17 16:28:22.042986 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.042948 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21"} err="failed to get container status \"7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21\": rpc error: code = NotFound desc = could not find container \"7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21\": container with ID starting with 7956b557200b8c5c6f6f046d696f2a29d5d6ae978af5b7b4f1f44750ce338c21 not found: ID does not exist"
Apr 17 16:28:22.064886 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.064846 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.066015 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.065979 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb" podStartSLOduration=2.06596743 podStartE2EDuration="2.06596743s" podCreationTimestamp="2026-04-17 16:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:28:22.062281002 +0000 UTC m=+488.442909364" watchObservedRunningTime="2026-04-17 16:28:22.06596743 +0000 UTC m=+488.446595790"
Apr 17 16:28:22.067239 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.067204 2569 status_manager.go:895] "Failed to get status for pod" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-sft6v" err="pods \"limitador-operator-controller-manager-85c4996f8c-sft6v\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.069448 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.069429 2569 status_manager.go:895] "Failed to get status for pod" podUID="750be774-3215-4390-bf85-b605cabecea2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5kzq6" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5kzq6\" is forbidden: User \"system:node:ip-10-0-136-214.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-214.ec2.internal' and this object"
Apr 17 16:28:22.140754 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.140716 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrqg\" (UniqueName: \"kubernetes.io/projected/eb4a0194-736b-4b69-ab5a-32731bbecb8c-kube-api-access-dlrqg\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-shv97\" (UID: \"eb4a0194-736b-4b69-ab5a-32731bbecb8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.140927 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.140783 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb4a0194-736b-4b69-ab5a-32731bbecb8c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-shv97\" (UID: \"eb4a0194-736b-4b69-ab5a-32731bbecb8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.141113 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.141097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb4a0194-736b-4b69-ab5a-32731bbecb8c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-shv97\" (UID: \"eb4a0194-736b-4b69-ab5a-32731bbecb8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.153839 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.153810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrqg\" (UniqueName: \"kubernetes.io/projected/eb4a0194-736b-4b69-ab5a-32731bbecb8c-kube-api-access-dlrqg\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-shv97\" (UID: \"eb4a0194-736b-4b69-ab5a-32731bbecb8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.218293 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.218198 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cb33870-85ad-4318-b170-0110affd63f8" path="/var/lib/kubelet/pods/2cb33870-85ad-4318-b170-0110affd63f8/volumes"
Apr 17 16:28:22.218598 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.218583 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750be774-3215-4390-bf85-b605cabecea2" path="/var/lib/kubelet/pods/750be774-3215-4390-bf85-b605cabecea2/volumes"
Apr 17 16:28:22.218864 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.218852 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee15d93-19f6-4c69-9af7-60f6cfe1dd4f" path="/var/lib/kubelet/pods/aee15d93-19f6-4c69-9af7-60f6cfe1dd4f/volumes"
Apr 17 16:28:22.272119 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.272079 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:22.419923 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:22.419892 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"]
Apr 17 16:28:22.422560 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:22.422527 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4a0194_736b_4b69_ab5a_32731bbecb8c.slice/crio-b2e650e030a1e603e5e1e0591f7d626fb1f8a0ba0164af4c9c1eef49c89d24b5 WatchSource:0}: Error finding container b2e650e030a1e603e5e1e0591f7d626fb1f8a0ba0164af4c9c1eef49c89d24b5: Status 404 returned error can't find the container with id b2e650e030a1e603e5e1e0591f7d626fb1f8a0ba0164af4c9c1eef49c89d24b5
Apr 17 16:28:23.036878 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:23.036831 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97" event={"ID":"eb4a0194-736b-4b69-ab5a-32731bbecb8c","Type":"ContainerStarted","Data":"ee3f2f0b453397d47798b52eb771a44704b2d15feaed4aa6f89200cacf607bef"}
Apr 17 16:28:23.036878 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:23.036876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97" event={"ID":"eb4a0194-736b-4b69-ab5a-32731bbecb8c","Type":"ContainerStarted","Data":"b2e650e030a1e603e5e1e0591f7d626fb1f8a0ba0164af4c9c1eef49c89d24b5"}
Apr 17 16:28:23.037342 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:23.036922 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:23.057828 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:23.057777 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97" podStartSLOduration=2.057762305 podStartE2EDuration="2.057762305s" podCreationTimestamp="2026-04-17 16:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:28:23.054322693 +0000 UTC m=+489.434951054" watchObservedRunningTime="2026-04-17 16:28:23.057762305 +0000 UTC m=+489.438390666"
Apr 17 16:28:33.039048 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:33.039003 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:34.043001 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.042964 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-shv97"
Apr 17 16:28:34.123466 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.123430 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"]
Apr 17 16:28:34.123703 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.123675 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb" podUID="5c57e9f8-643a-4563-ad48-d4f2e138a3c8" containerName="manager" containerID="cri-o://8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0" gracePeriod=10
Apr 17 16:28:34.372007 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.371983 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:34.453832 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.453804 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-extensions-socket-volume\") pod \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") "
Apr 17 16:28:34.454014 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.453847 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhl77\" (UniqueName: \"kubernetes.io/projected/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-kube-api-access-hhl77\") pod \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\" (UID: \"5c57e9f8-643a-4563-ad48-d4f2e138a3c8\") "
Apr 17 16:28:34.454247 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.454194 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "5c57e9f8-643a-4563-ad48-d4f2e138a3c8" (UID: "5c57e9f8-643a-4563-ad48-d4f2e138a3c8"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:28:34.455933 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.455906 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-kube-api-access-hhl77" (OuterVolumeSpecName: "kube-api-access-hhl77") pod "5c57e9f8-643a-4563-ad48-d4f2e138a3c8" (UID: "5c57e9f8-643a-4563-ad48-d4f2e138a3c8"). InnerVolumeSpecName "kube-api-access-hhl77". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:28:34.555188 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.555156 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-extensions-socket-volume\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:34.555188 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:34.555182 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhl77\" (UniqueName: \"kubernetes.io/projected/5c57e9f8-643a-4563-ad48-d4f2e138a3c8-kube-api-access-hhl77\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:28:35.092904 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.092863 2569 generic.go:358] "Generic (PLEG): container finished" podID="5c57e9f8-643a-4563-ad48-d4f2e138a3c8" containerID="8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0" exitCode=0
Apr 17 16:28:35.093374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.092924 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"
Apr 17 16:28:35.093374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.092948 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb" event={"ID":"5c57e9f8-643a-4563-ad48-d4f2e138a3c8","Type":"ContainerDied","Data":"8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0"}
Apr 17 16:28:35.093374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.092982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb" event={"ID":"5c57e9f8-643a-4563-ad48-d4f2e138a3c8","Type":"ContainerDied","Data":"4a65d2ef5fee8bc1f8855007572c71d22f99b6134c5676b140a1de8139d6fe53"}
Apr 17 16:28:35.093374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.092999 2569 scope.go:117] "RemoveContainer" containerID="8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0"
Apr 17 16:28:35.102075 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.102058 2569 scope.go:117] "RemoveContainer" containerID="8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0"
Apr 17 16:28:35.102321 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:28:35.102295 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0\": container with ID starting with 8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0 not found: ID does not exist" containerID="8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0"
Apr 17 16:28:35.102418 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.102323 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0"} err="failed to get container status \"8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0\": rpc error: code = NotFound desc = could not find container \"8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0\": container with ID starting with 8bcebeb35acca5c1fcee8035aaaad8f05cada755d8bee69cf6901d8f3510a2c0 not found: ID does not exist"
Apr 17 16:28:35.119751 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.119718 2569 kubelet.go:2553]
"SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"] Apr 17 16:28:35.135269 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:35.135222 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p8fkb"] Apr 17 16:28:36.216808 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:36.216776 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c57e9f8-643a-4563-ad48-d4f2e138a3c8" path="/var/lib/kubelet/pods/5c57e9f8-643a-4563-ad48-d4f2e138a3c8/volumes" Apr 17 16:28:50.331663 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.331621 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn"] Apr 17 16:28:50.332117 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.331986 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c57e9f8-643a-4563-ad48-d4f2e138a3c8" containerName="manager" Apr 17 16:28:50.332117 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.331997 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c57e9f8-643a-4563-ad48-d4f2e138a3c8" containerName="manager" Apr 17 16:28:50.332117 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.332067 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c57e9f8-643a-4563-ad48-d4f2e138a3c8" containerName="manager" Apr 17 16:28:50.338384 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.338353 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.340881 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.340853 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-sffbm\"" Apr 17 16:28:50.347486 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.347457 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn"] Apr 17 16:28:50.395711 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.395674 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr8br\" (UniqueName: \"kubernetes.io/projected/f8651809-7728-4f9e-ac55-a5441cb3d52d-kube-api-access-jr8br\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.395879 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.395716 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.395836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f8651809-7728-4f9e-ac55-a5441cb3d52d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.396303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.396439 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.396497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.396527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.396563 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.397343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.396597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f8651809-7728-4f9e-ac55-a5441cb3d52d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497509 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497514 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497758 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497624 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497995 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr8br\" (UniqueName: \"kubernetes.io/projected/f8651809-7728-4f9e-ac55-a5441cb3d52d-kube-api-access-jr8br\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.497995 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.497810 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.498108 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.498042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.498169 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.498102 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.498218 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.498161 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.498370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.498348 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.498415 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.498368 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f8651809-7728-4f9e-ac55-a5441cb3d52d-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.499932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.499903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.500275 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.500257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.505949 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.505926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f8651809-7728-4f9e-ac55-a5441cb3d52d-istio-token\") pod 
\"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.506110 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.506092 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr8br\" (UniqueName: \"kubernetes.io/projected/f8651809-7728-4f9e-ac55-a5441cb3d52d-kube-api-access-jr8br\") pod \"maas-default-gateway-openshift-default-58b6f876-9rngn\" (UID: \"f8651809-7728-4f9e-ac55-a5441cb3d52d\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.653072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.652980 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:50.783586 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.783561 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn"] Apr 17 16:28:50.786067 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:50.786027 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8651809_7728_4f9e_ac55_a5441cb3d52d.slice/crio-9de407dcc5b86c9015b5b35c970e4c6670cb5ee4a6138c47f334d3d3677184c5 WatchSource:0}: Error finding container 9de407dcc5b86c9015b5b35c970e4c6670cb5ee4a6138c47f334d3d3677184c5: Status 404 returned error can't find the container with id 9de407dcc5b86c9015b5b35c970e4c6670cb5ee4a6138c47f334d3d3677184c5 Apr 17 16:28:50.788584 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.788554 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:28:50.788675 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.788616 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:28:50.788675 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:50.788643 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:28:51.160019 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:51.159944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" event={"ID":"f8651809-7728-4f9e-ac55-a5441cb3d52d","Type":"ContainerStarted","Data":"3b247b9ee8daae628edd74b038200238cfbf5de2eee076096b23c5c479787b9b"} Apr 17 16:28:51.160019 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:51.159991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" event={"ID":"f8651809-7728-4f9e-ac55-a5441cb3d52d","Type":"ContainerStarted","Data":"9de407dcc5b86c9015b5b35c970e4c6670cb5ee4a6138c47f334d3d3677184c5"} Apr 17 16:28:51.180398 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:51.180346 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" podStartSLOduration=1.180329006 podStartE2EDuration="1.180329006s" podCreationTimestamp="2026-04-17 16:28:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:28:51.178009288 +0000 UTC m=+517.558637649" watchObservedRunningTime="2026-04-17 16:28:51.180329006 +0000 UTC m=+517.560957366" Apr 17 16:28:51.653841 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:51.653795 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:51.658569 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:51.658541 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:52.164450 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:52.164420 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:52.165364 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:52.165340 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-9rngn" Apr 17 16:28:55.011949 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.011860 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:28:55.015784 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.015767 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.018135 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.018112 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-vpm4x\"" Apr 17 16:28:55.018258 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.018115 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 16:28:55.022060 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.022036 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:28:55.041243 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.041208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xb8j\" (UniqueName: \"kubernetes.io/projected/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-kube-api-access-9xb8j\") pod \"limitador-limitador-7d549b5b-8f7cl\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.041401 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.041336 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-config-file\") pod \"limitador-limitador-7d549b5b-8f7cl\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.114020 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.113988 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:28:55.142531 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.142491 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-config-file\") pod \"limitador-limitador-7d549b5b-8f7cl\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.142723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.142618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xb8j\" (UniqueName: \"kubernetes.io/projected/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-kube-api-access-9xb8j\") pod \"limitador-limitador-7d549b5b-8f7cl\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.143156 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.143137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-config-file\") pod \"limitador-limitador-7d549b5b-8f7cl\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.152879 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.152855 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xb8j\" (UniqueName: \"kubernetes.io/projected/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-kube-api-access-9xb8j\") pod \"limitador-limitador-7d549b5b-8f7cl\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.327991 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.327957 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:55.456016 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.455990 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:28:55.457845 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:55.457795 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ea9fbd1_f3d7_4c4c_b008_817d173e85ad.slice/crio-9e0e0a3491e5c1204867c84cbf750a02d8eae5fc23e3a3b199fbceb8c99ec0a7 WatchSource:0}: Error finding container 9e0e0a3491e5c1204867c84cbf750a02d8eae5fc23e3a3b199fbceb8c99ec0a7: Status 404 returned error can't find the container with id 9e0e0a3491e5c1204867c84cbf750a02d8eae5fc23e3a3b199fbceb8c99ec0a7 Apr 17 16:28:55.819807 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.819777 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-z9pbj"] Apr 17 16:28:55.823680 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.823656 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:28:55.827440 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.826668 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q4stg\"" Apr 17 16:28:55.828395 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.828374 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-z9pbj"] Apr 17 16:28:55.847899 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.847871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pzdb\" (UniqueName: \"kubernetes.io/projected/4df7d761-4ee3-40b3-ac75-40b257633f8c-kube-api-access-2pzdb\") pod \"authorino-f99f4b5cd-z9pbj\" (UID: \"4df7d761-4ee3-40b3-ac75-40b257633f8c\") " pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:28:55.949441 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.949412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pzdb\" (UniqueName: \"kubernetes.io/projected/4df7d761-4ee3-40b3-ac75-40b257633f8c-kube-api-access-2pzdb\") pod \"authorino-f99f4b5cd-z9pbj\" (UID: \"4df7d761-4ee3-40b3-ac75-40b257633f8c\") " pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:28:55.957995 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:55.957966 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pzdb\" (UniqueName: \"kubernetes.io/projected/4df7d761-4ee3-40b3-ac75-40b257633f8c-kube-api-access-2pzdb\") pod \"authorino-f99f4b5cd-z9pbj\" (UID: \"4df7d761-4ee3-40b3-ac75-40b257633f8c\") " pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:28:56.135324 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:56.135246 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:28:56.187030 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:56.186938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" event={"ID":"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad","Type":"ContainerStarted","Data":"9e0e0a3491e5c1204867c84cbf750a02d8eae5fc23e3a3b199fbceb8c99ec0a7"} Apr 17 16:28:56.307245 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:56.307185 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-z9pbj"] Apr 17 16:28:56.314077 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:28:56.314046 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4df7d761_4ee3_40b3_ac75_40b257633f8c.slice/crio-899ef99878be4e4f94e8860d758a233bc1e5cd944fedbb04cf64e2462245762c WatchSource:0}: Error finding container 899ef99878be4e4f94e8860d758a233bc1e5cd944fedbb04cf64e2462245762c: Status 404 returned error can't find the container with id 899ef99878be4e4f94e8860d758a233bc1e5cd944fedbb04cf64e2462245762c Apr 17 16:28:57.192434 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:57.192367 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" event={"ID":"4df7d761-4ee3-40b3-ac75-40b257633f8c","Type":"ContainerStarted","Data":"899ef99878be4e4f94e8860d758a233bc1e5cd944fedbb04cf64e2462245762c"} Apr 17 16:28:59.203343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:59.203302 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" event={"ID":"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad","Type":"ContainerStarted","Data":"0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c"} Apr 17 16:28:59.203738 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:59.203415 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:28:59.221922 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:28:59.221862 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" podStartSLOduration=2.366016925 podStartE2EDuration="5.22184428s" podCreationTimestamp="2026-04-17 16:28:54 +0000 UTC" firstStartedPulling="2026-04-17 16:28:55.459785329 +0000 UTC m=+521.840413668" lastFinishedPulling="2026-04-17 16:28:58.315612668 +0000 UTC m=+524.696241023" observedRunningTime="2026-04-17 16:28:59.217707083 +0000 UTC m=+525.598335445" watchObservedRunningTime="2026-04-17 16:28:59.22184428 +0000 UTC m=+525.602472641" Apr 17 16:29:00.208061 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:00.208019 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" event={"ID":"4df7d761-4ee3-40b3-ac75-40b257633f8c","Type":"ContainerStarted","Data":"f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d"} Apr 17 16:29:00.223339 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:00.223294 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" podStartSLOduration=1.917095351 podStartE2EDuration="5.223280033s" podCreationTimestamp="2026-04-17 16:28:55 +0000 UTC" firstStartedPulling="2026-04-17 16:28:56.316020011 +0000 UTC m=+522.696648362" lastFinishedPulling="2026-04-17 16:28:59.622204706 +0000 UTC m=+526.002833044" 
observedRunningTime="2026-04-17 16:29:00.221418305 +0000 UTC m=+526.602046665" watchObservedRunningTime="2026-04-17 16:29:00.223280033 +0000 UTC m=+526.603908394" Apr 17 16:29:02.006046 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:02.006012 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-z9pbj"] Apr 17 16:29:02.216609 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:02.216251 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" podUID="4df7d761-4ee3-40b3-ac75-40b257633f8c" containerName="authorino" containerID="cri-o://f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d" gracePeriod=30 Apr 17 16:29:02.457276 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:02.457254 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:29:02.512672 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:02.512642 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pzdb\" (UniqueName: \"kubernetes.io/projected/4df7d761-4ee3-40b3-ac75-40b257633f8c-kube-api-access-2pzdb\") pod \"4df7d761-4ee3-40b3-ac75-40b257633f8c\" (UID: \"4df7d761-4ee3-40b3-ac75-40b257633f8c\") " Apr 17 16:29:02.514633 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:02.514602 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df7d761-4ee3-40b3-ac75-40b257633f8c-kube-api-access-2pzdb" (OuterVolumeSpecName: "kube-api-access-2pzdb") pod "4df7d761-4ee3-40b3-ac75-40b257633f8c" (UID: "4df7d761-4ee3-40b3-ac75-40b257633f8c"). InnerVolumeSpecName "kube-api-access-2pzdb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:29:02.614070 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:02.613989 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pzdb\" (UniqueName: \"kubernetes.io/projected/4df7d761-4ee3-40b3-ac75-40b257633f8c-kube-api-access-2pzdb\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:29:03.221449 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.221409 2569 generic.go:358] "Generic (PLEG): container finished" podID="4df7d761-4ee3-40b3-ac75-40b257633f8c" containerID="f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d" exitCode=0 Apr 17 16:29:03.221449 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.221453 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" Apr 17 16:29:03.221962 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.221496 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" event={"ID":"4df7d761-4ee3-40b3-ac75-40b257633f8c","Type":"ContainerDied","Data":"f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d"} Apr 17 16:29:03.221962 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.221529 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-z9pbj" event={"ID":"4df7d761-4ee3-40b3-ac75-40b257633f8c","Type":"ContainerDied","Data":"899ef99878be4e4f94e8860d758a233bc1e5cd944fedbb04cf64e2462245762c"} Apr 17 16:29:03.221962 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.221548 2569 scope.go:117] "RemoveContainer" containerID="f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d" Apr 17 16:29:03.233179 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.233119 2569 scope.go:117] "RemoveContainer" containerID="f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d" Apr 17 16:29:03.233667 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:03.233645 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d\": container with ID starting with f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d not found: ID does not exist" containerID="f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d" Apr 17 16:29:03.233756 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.233680 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d"} err="failed to get container status \"f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d\": rpc error: code = NotFound desc = could not find container \"f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d\": container with ID starting with f0e4167f051f488918a82ae5914af7282ddf0524e5fffbc144fc352b455e1f6d not found: ID does not exist" Apr 17 16:29:03.243997 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.243972 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-z9pbj"] Apr 17 16:29:03.246299 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:03.246275 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-z9pbj"] Apr 17 16:29:04.223682 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:04.223645 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df7d761-4ee3-40b3-ac75-40b257633f8c" path="/var/lib/kubelet/pods/4df7d761-4ee3-40b3-ac75-40b257633f8c/volumes" Apr 17 16:29:10.208960 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:10.208929 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:29:13.010409 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:13.010373 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:29:13.010929 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:13.010588 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" podUID="1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" 
containerName="limitador" containerID="cri-o://0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c" gracePeriod=30 Apr 17 16:29:13.953560 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:13.953536 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:29:14.021753 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.021725 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-config-file\") pod \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " Apr 17 16:29:14.022165 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.021797 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xb8j\" (UniqueName: \"kubernetes.io/projected/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-kube-api-access-9xb8j\") pod \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\" (UID: \"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad\") " Apr 17 16:29:14.022165 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.022069 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-config-file" (OuterVolumeSpecName: "config-file") pod "1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" (UID: "1ea9fbd1-f3d7-4c4c-b008-817d173e85ad"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:29:14.023766 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.023745 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-kube-api-access-9xb8j" (OuterVolumeSpecName: "kube-api-access-9xb8j") pod "1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" (UID: "1ea9fbd1-f3d7-4c4c-b008-817d173e85ad"). InnerVolumeSpecName "kube-api-access-9xb8j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:29:14.123023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.122940 2569 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-config-file\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:29:14.123023 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.122971 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9xb8j\" (UniqueName: \"kubernetes.io/projected/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad-kube-api-access-9xb8j\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\"" Apr 17 16:29:14.266008 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.265967 2569 generic.go:358] "Generic (PLEG): container finished" podID="1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" containerID="0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c" exitCode=0 Apr 17 16:29:14.266175 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.266048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" event={"ID":"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad","Type":"ContainerDied","Data":"0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c"} Apr 17 16:29:14.266175 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.266085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" event={"ID":"1ea9fbd1-f3d7-4c4c-b008-817d173e85ad","Type":"ContainerDied","Data":"9e0e0a3491e5c1204867c84cbf750a02d8eae5fc23e3a3b199fbceb8c99ec0a7"} Apr 17 16:29:14.266175 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.266104 2569 scope.go:117] "RemoveContainer" containerID="0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c" Apr 17 16:29:14.266175 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.266055 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-8f7cl" Apr 17 16:29:14.274293 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.274275 2569 scope.go:117] "RemoveContainer" containerID="0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c" Apr 17 16:29:14.274542 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:14.274520 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c\": container with ID starting with 0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c not found: ID does not exist" containerID="0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c" Apr 17 16:29:14.274634 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.274546 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c"} err="failed to get container status \"0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c\": rpc error: code = NotFound desc = could not find container \"0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c\": container with ID starting with 0c0049f14f1e85374ef1cf89c4f62408890c9c92feaaeec665fe17065c751c4c not found: ID does not exist" Apr 17 16:29:14.283005 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.282981 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:29:14.287925 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:14.287897 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-8f7cl"] Apr 17 16:29:16.218637 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.218603 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" path="/var/lib/kubelet/pods/1ea9fbd1-f3d7-4c4c-b008-817d173e85ad/volumes" Apr 17 16:29:16.219112 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.218878 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-swrsx"] Apr 17 16:29:16.219214 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.219198 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" containerName="limitador" Apr 17 16:29:16.219305 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.219215 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" containerName="limitador" Apr 17 16:29:16.219305 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.219222 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4df7d761-4ee3-40b3-ac75-40b257633f8c" containerName="authorino" Apr 17 16:29:16.219305 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.219252 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df7d761-4ee3-40b3-ac75-40b257633f8c" containerName="authorino" Apr 17 16:29:16.219462 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.219316 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4df7d761-4ee3-40b3-ac75-40b257633f8c" containerName="authorino" Apr 17 16:29:16.219462 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.219329 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ea9fbd1-f3d7-4c4c-b008-817d173e85ad" containerName="limitador" Apr 17 16:29:16.224282 
ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.224259 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.227374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.227350 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-5mhnj\"" Apr 17 16:29:16.227542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.227527 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 16:29:16.232755 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.232472 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-swrsx"] Apr 17 16:29:16.342291 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.342255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5474b0d-7a8c-48cf-bb20-e3d94df6c617-data\") pod \"postgres-868db5846d-swrsx\" (UID: \"c5474b0d-7a8c-48cf-bb20-e3d94df6c617\") " pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.342512 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.342343 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5g6w\" (UniqueName: \"kubernetes.io/projected/c5474b0d-7a8c-48cf-bb20-e3d94df6c617-kube-api-access-k5g6w\") pod \"postgres-868db5846d-swrsx\" (UID: \"c5474b0d-7a8c-48cf-bb20-e3d94df6c617\") " pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.443379 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.443341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5474b0d-7a8c-48cf-bb20-e3d94df6c617-data\") pod \"postgres-868db5846d-swrsx\" (UID: \"c5474b0d-7a8c-48cf-bb20-e3d94df6c617\") " pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.443556 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.443391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5g6w\" (UniqueName: \"kubernetes.io/projected/c5474b0d-7a8c-48cf-bb20-e3d94df6c617-kube-api-access-k5g6w\") pod \"postgres-868db5846d-swrsx\" (UID: \"c5474b0d-7a8c-48cf-bb20-e3d94df6c617\") " pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.443781 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.443758 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/c5474b0d-7a8c-48cf-bb20-e3d94df6c617-data\") pod \"postgres-868db5846d-swrsx\" (UID: \"c5474b0d-7a8c-48cf-bb20-e3d94df6c617\") " pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.453980 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.453956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5g6w\" (UniqueName: \"kubernetes.io/projected/c5474b0d-7a8c-48cf-bb20-e3d94df6c617-kube-api-access-k5g6w\") pod \"postgres-868db5846d-swrsx\" (UID: \"c5474b0d-7a8c-48cf-bb20-e3d94df6c617\") " pod="opendatahub/postgres-868db5846d-swrsx" Apr 17 16:29:16.537489 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.537386 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-swrsx"
Apr 17 16:29:16.676979 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:16.676944 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-swrsx"]
Apr 17 16:29:16.678013 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:29:16.677983 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5474b0d_7a8c_48cf_bb20_e3d94df6c617.slice/crio-e54a8a8293b28e8d9257bfd91e3d42d8dce9a1bd3f4288f4a657b86c8ae31bd1 WatchSource:0}: Error finding container e54a8a8293b28e8d9257bfd91e3d42d8dce9a1bd3f4288f4a657b86c8ae31bd1: Status 404 returned error can't find the container with id e54a8a8293b28e8d9257bfd91e3d42d8dce9a1bd3f4288f4a657b86c8ae31bd1
Apr 17 16:29:17.279094 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:17.279055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-swrsx" event={"ID":"c5474b0d-7a8c-48cf-bb20-e3d94df6c617","Type":"ContainerStarted","Data":"e54a8a8293b28e8d9257bfd91e3d42d8dce9a1bd3f4288f4a657b86c8ae31bd1"}
Apr 17 16:29:22.305212 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:22.305179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-swrsx" event={"ID":"c5474b0d-7a8c-48cf-bb20-e3d94df6c617","Type":"ContainerStarted","Data":"47f709e9d77fda9cae23271f3e268c95b1ee17cea4f0d4b4b88aa8e2de7e84c1"}
Apr 17 16:29:22.305589 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:22.305271 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-swrsx"
Apr 17 16:29:22.319956 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:22.319907 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-swrsx" podStartSLOduration=1.376508918 podStartE2EDuration="6.319893207s" podCreationTimestamp="2026-04-17 16:29:16 +0000 UTC" firstStartedPulling="2026-04-17 16:29:16.679440953 +0000 UTC m=+543.060069298" lastFinishedPulling="2026-04-17 16:29:21.622825244 +0000 UTC m=+548.003453587" observedRunningTime="2026-04-17 16:29:22.318801892 +0000 UTC m=+548.699430254" watchObservedRunningTime="2026-04-17 16:29:22.319893207 +0000 UTC m=+548.700521598"
Apr 17 16:29:28.338370 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:28.338335 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-swrsx"
Apr 17 16:29:28.863389 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:28.863358 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vmh6f"]
Apr 17 16:29:28.871652 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:28.871626 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:28.872700 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:28.872673 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vmh6f"]
Apr 17 16:29:28.874001 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:28.873980 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q4stg\""
Apr 17 16:29:29.062615 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.062581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtm9n\" (UniqueName: \"kubernetes.io/projected/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574-kube-api-access-wtm9n\") pod \"authorino-8b475cf9f-vmh6f\" (UID: \"c96c7aaa-3b14-43a6-a8fb-3b69a99c6574\") " pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:29.113908 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.113820 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vmh6f"]
Apr 17 16:29:29.114108 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:29.114090 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wtm9n], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-vmh6f" podUID="c96c7aaa-3b14-43a6-a8fb-3b69a99c6574"
Apr 17 16:29:29.142215 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.142179 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7f6f748cc-zppx8"]
Apr 17 16:29:29.163403 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.163368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtm9n\" (UniqueName: \"kubernetes.io/projected/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574-kube-api-access-wtm9n\") pod \"authorino-8b475cf9f-vmh6f\" (UID: \"c96c7aaa-3b14-43a6-a8fb-3b69a99c6574\") " pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:29.170711 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.170680 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtm9n\" (UniqueName: \"kubernetes.io/projected/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574-kube-api-access-wtm9n\") pod \"authorino-8b475cf9f-vmh6f\" (UID: \"c96c7aaa-3b14-43a6-a8fb-3b69a99c6574\") " pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:29.205064 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.205030 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7f6f748cc-zppx8"]
Apr 17 16:29:29.205222 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.205159 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.207809 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.207774 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 17 16:29:29.332447 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.332415 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:29.337250 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.337206 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:29.365133 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.365072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3ee2df98-88cb-4cac-80e7-204599e8265c-tls-cert\") pod \"authorino-7f6f748cc-zppx8\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") " pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.365466 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.365190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74mg8\" (UniqueName: \"kubernetes.io/projected/3ee2df98-88cb-4cac-80e7-204599e8265c-kube-api-access-74mg8\") pod \"authorino-7f6f748cc-zppx8\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") " pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.466382 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.466348 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtm9n\" (UniqueName: \"kubernetes.io/projected/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574-kube-api-access-wtm9n\") pod \"c96c7aaa-3b14-43a6-a8fb-3b69a99c6574\" (UID: \"c96c7aaa-3b14-43a6-a8fb-3b69a99c6574\") "
Apr 17 16:29:29.466535 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.466522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74mg8\" (UniqueName: \"kubernetes.io/projected/3ee2df98-88cb-4cac-80e7-204599e8265c-kube-api-access-74mg8\") pod \"authorino-7f6f748cc-zppx8\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") " pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.466576 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.466553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3ee2df98-88cb-4cac-80e7-204599e8265c-tls-cert\") pod \"authorino-7f6f748cc-zppx8\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") " pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.468504 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.468475 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574-kube-api-access-wtm9n" (OuterVolumeSpecName: "kube-api-access-wtm9n") pod "c96c7aaa-3b14-43a6-a8fb-3b69a99c6574" (UID: "c96c7aaa-3b14-43a6-a8fb-3b69a99c6574"). InnerVolumeSpecName "kube-api-access-wtm9n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:29:29.469059 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.469039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3ee2df98-88cb-4cac-80e7-204599e8265c-tls-cert\") pod \"authorino-7f6f748cc-zppx8\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") " pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.473933 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.473905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74mg8\" (UniqueName: \"kubernetes.io/projected/3ee2df98-88cb-4cac-80e7-204599e8265c-kube-api-access-74mg8\") pod \"authorino-7f6f748cc-zppx8\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") " pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.514640 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.514614 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:29:29.567289 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.567252 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtm9n\" (UniqueName: \"kubernetes.io/projected/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574-kube-api-access-wtm9n\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:29:29.638742 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:29.638716 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7f6f748cc-zppx8"]
Apr 17 16:29:29.640212 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:29:29.640183 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ee2df98_88cb_4cac_80e7_204599e8265c.slice/crio-844594899c26d6acdc4a23d183b585c254da1c474d856987f38ebd7ffc44e165 WatchSource:0}: Error finding container 844594899c26d6acdc4a23d183b585c254da1c474d856987f38ebd7ffc44e165: Status 404 returned error can't find the container with id 844594899c26d6acdc4a23d183b585c254da1c474d856987f38ebd7ffc44e165
Apr 17 16:29:30.337496 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:30.337456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7f6f748cc-zppx8" event={"ID":"3ee2df98-88cb-4cac-80e7-204599e8265c","Type":"ContainerStarted","Data":"d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5"}
Apr 17 16:29:30.337496 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:30.337501 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7f6f748cc-zppx8" event={"ID":"3ee2df98-88cb-4cac-80e7-204599e8265c","Type":"ContainerStarted","Data":"844594899c26d6acdc4a23d183b585c254da1c474d856987f38ebd7ffc44e165"}
Apr 17 16:29:30.337737 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:30.337619 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-vmh6f"
Apr 17 16:29:30.356396 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:30.356349 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7f6f748cc-zppx8" podStartSLOduration=0.882157513 podStartE2EDuration="1.356332887s" podCreationTimestamp="2026-04-17 16:29:29 +0000 UTC" firstStartedPulling="2026-04-17 16:29:29.641536425 +0000 UTC m=+556.022164767" lastFinishedPulling="2026-04-17 16:29:30.115711796 +0000 UTC m=+556.496340141" observedRunningTime="2026-04-17 16:29:30.352866951 +0000 UTC m=+556.733495314" watchObservedRunningTime="2026-04-17 16:29:30.356332887 +0000 UTC m=+556.736961247"
Apr 17 16:29:30.386568 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:30.386475 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vmh6f"]
Apr 17 16:29:30.393559 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:30.393530 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-vmh6f"]
Apr 17 16:29:31.279684 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.279638 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-qmls6"]
Apr 17 16:29:31.329891 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.329861 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-qmls6"]
Apr 17 16:29:31.330073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.330004 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:31.332557 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.332536 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-z7k5r\""
Apr 17 16:29:31.386851 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.386809 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56cg\" (UniqueName: \"kubernetes.io/projected/2208711a-e000-4747-899f-b8cdf1e4acfd-kube-api-access-d56cg\") pod \"maas-controller-6d4c8f55f9-qmls6\" (UID: \"2208711a-e000-4747-899f-b8cdf1e4acfd\") " pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:31.425194 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.425163 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7dbff48c98-4xvtn"]
Apr 17 16:29:31.451918 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.451887 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dbff48c98-4xvtn"]
Apr 17 16:29:31.452049 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.451995 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:31.488058 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.488031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jsf\" (UniqueName: \"kubernetes.io/projected/0b40210d-dd6b-425e-acf1-92ecc432a9be-kube-api-access-r9jsf\") pod \"maas-controller-7dbff48c98-4xvtn\" (UID: \"0b40210d-dd6b-425e-acf1-92ecc432a9be\") " pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:31.488209 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.488110 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d56cg\" (UniqueName: \"kubernetes.io/projected/2208711a-e000-4747-899f-b8cdf1e4acfd-kube-api-access-d56cg\") pod \"maas-controller-6d4c8f55f9-qmls6\" (UID: \"2208711a-e000-4747-899f-b8cdf1e4acfd\") " pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:31.496211 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.496177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56cg\" (UniqueName: \"kubernetes.io/projected/2208711a-e000-4747-899f-b8cdf1e4acfd-kube-api-access-d56cg\") pod \"maas-controller-6d4c8f55f9-qmls6\" (UID: \"2208711a-e000-4747-899f-b8cdf1e4acfd\") " pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:31.546984 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.546903 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-qmls6"]
Apr 17 16:29:31.547205 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.547193 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:31.589554 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.589508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jsf\" (UniqueName: \"kubernetes.io/projected/0b40210d-dd6b-425e-acf1-92ecc432a9be-kube-api-access-r9jsf\") pod \"maas-controller-7dbff48c98-4xvtn\" (UID: \"0b40210d-dd6b-425e-acf1-92ecc432a9be\") " pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:31.598543 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.598514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jsf\" (UniqueName: \"kubernetes.io/projected/0b40210d-dd6b-425e-acf1-92ecc432a9be-kube-api-access-r9jsf\") pod \"maas-controller-7dbff48c98-4xvtn\" (UID: \"0b40210d-dd6b-425e-acf1-92ecc432a9be\") " pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:31.677207 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.677176 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-qmls6"]
Apr 17 16:29:31.678922 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:29:31.678894 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2208711a_e000_4747_899f_b8cdf1e4acfd.slice/crio-fe146e109139e46e2c74e9aa3b986db146bb67efd6623689df6577e1100b72d1 WatchSource:0}: Error finding container fe146e109139e46e2c74e9aa3b986db146bb67efd6623689df6577e1100b72d1: Status 404 returned error can't find the container with id fe146e109139e46e2c74e9aa3b986db146bb67efd6623689df6577e1100b72d1
Apr 17 16:29:31.761814 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.761774 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:31.886303 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:31.886274 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dbff48c98-4xvtn"]
Apr 17 16:29:31.886975 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:29:31.886947 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b40210d_dd6b_425e_acf1_92ecc432a9be.slice/crio-e1d46df62db53421e12989ca4052f498b10987b579e21238dd4b47ccfbd55ab8 WatchSource:0}: Error finding container e1d46df62db53421e12989ca4052f498b10987b579e21238dd4b47ccfbd55ab8: Status 404 returned error can't find the container with id e1d46df62db53421e12989ca4052f498b10987b579e21238dd4b47ccfbd55ab8
Apr 17 16:29:32.218360 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:32.218323 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96c7aaa-3b14-43a6-a8fb-3b69a99c6574" path="/var/lib/kubelet/pods/c96c7aaa-3b14-43a6-a8fb-3b69a99c6574/volumes"
Apr 17 16:29:32.349325 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:32.349286 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbff48c98-4xvtn" event={"ID":"0b40210d-dd6b-425e-acf1-92ecc432a9be","Type":"ContainerStarted","Data":"e1d46df62db53421e12989ca4052f498b10987b579e21238dd4b47ccfbd55ab8"}
Apr 17 16:29:32.350843 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:32.350809 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6" event={"ID":"2208711a-e000-4747-899f-b8cdf1e4acfd","Type":"ContainerStarted","Data":"fe146e109139e46e2c74e9aa3b986db146bb67efd6623689df6577e1100b72d1"}
Apr 17 16:29:34.361402 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:34.361359 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6" event={"ID":"2208711a-e000-4747-899f-b8cdf1e4acfd","Type":"ContainerStarted","Data":"8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd"}
Apr 17 16:29:34.361850 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:34.361437 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:34.361850 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:34.361458 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6" podUID="2208711a-e000-4747-899f-b8cdf1e4acfd" containerName="manager" containerID="cri-o://8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd" gracePeriod=10
Apr 17 16:29:34.379810 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:34.379753 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6" podStartSLOduration=0.820384727 podStartE2EDuration="3.379733119s" podCreationTimestamp="2026-04-17 16:29:31 +0000 UTC" firstStartedPulling="2026-04-17 16:29:31.680851846 +0000 UTC m=+558.061480185" lastFinishedPulling="2026-04-17 16:29:34.240200239 +0000 UTC m=+560.620828577" observedRunningTime="2026-04-17 16:29:34.375017484 +0000 UTC m=+560.755645841" watchObservedRunningTime="2026-04-17 16:29:34.379733119 +0000 UTC m=+560.760361482"
Apr 17 16:29:34.909510 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:34.909463 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:35.021888 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.021782 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d56cg\" (UniqueName: \"kubernetes.io/projected/2208711a-e000-4747-899f-b8cdf1e4acfd-kube-api-access-d56cg\") pod \"2208711a-e000-4747-899f-b8cdf1e4acfd\" (UID: \"2208711a-e000-4747-899f-b8cdf1e4acfd\") "
Apr 17 16:29:35.023927 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.023904 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2208711a-e000-4747-899f-b8cdf1e4acfd-kube-api-access-d56cg" (OuterVolumeSpecName: "kube-api-access-d56cg") pod "2208711a-e000-4747-899f-b8cdf1e4acfd" (UID: "2208711a-e000-4747-899f-b8cdf1e4acfd"). InnerVolumeSpecName "kube-api-access-d56cg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:29:35.122591 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.122551 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d56cg\" (UniqueName: \"kubernetes.io/projected/2208711a-e000-4747-899f-b8cdf1e4acfd-kube-api-access-d56cg\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:29:35.366872 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.366833 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbff48c98-4xvtn" event={"ID":"0b40210d-dd6b-425e-acf1-92ecc432a9be","Type":"ContainerStarted","Data":"eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd"}
Apr 17 16:29:35.367312 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.366907 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:35.368020 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.367994 2569 generic.go:358] "Generic (PLEG): container finished" podID="2208711a-e000-4747-899f-b8cdf1e4acfd" containerID="8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd" exitCode=2
Apr 17 16:29:35.368130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.368041 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6"
Apr 17 16:29:35.368130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.368075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6" event={"ID":"2208711a-e000-4747-899f-b8cdf1e4acfd","Type":"ContainerDied","Data":"8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd"}
Apr 17 16:29:35.368130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.368105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-qmls6" event={"ID":"2208711a-e000-4747-899f-b8cdf1e4acfd","Type":"ContainerDied","Data":"fe146e109139e46e2c74e9aa3b986db146bb67efd6623689df6577e1100b72d1"}
Apr 17 16:29:35.368130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.368120 2569 scope.go:117] "RemoveContainer" containerID="8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd"
Apr 17 16:29:35.376954 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.376939 2569 scope.go:117] "RemoveContainer" containerID="8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd"
Apr 17 16:29:35.377189 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:35.377172 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd\": container with ID starting with 8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd not found: ID does not exist" containerID="8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd"
Apr 17 16:29:35.377266 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.377197 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd"} err="failed to get container status \"8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd\": rpc error: code = NotFound desc = could not find container \"8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd\": container with ID starting with 8ac4c0c41e372003ed59054e69e955173bf91cedb5cef374b6445bb845af11bd not found: ID does not exist"
Apr 17 16:29:35.384401 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.384365 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7dbff48c98-4xvtn" podStartSLOduration=1.4566154660000001 podStartE2EDuration="4.384352668s" podCreationTimestamp="2026-04-17 16:29:31 +0000 UTC" firstStartedPulling="2026-04-17 16:29:31.888353867 +0000 UTC m=+558.268982207" lastFinishedPulling="2026-04-17 16:29:34.816091068 +0000 UTC m=+561.196719409" observedRunningTime="2026-04-17 16:29:35.381530229 +0000 UTC m=+561.762158590" watchObservedRunningTime="2026-04-17 16:29:35.384352668 +0000 UTC m=+561.764981042"
Apr 17 16:29:35.394465 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.394443 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-qmls6"]
Apr 17 16:29:35.396632 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:35.396613 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-qmls6"]
Apr 17 16:29:36.218197 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.218165 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2208711a-e000-4747-899f-b8cdf1e4acfd" path="/var/lib/kubelet/pods/2208711a-e000-4747-899f-b8cdf1e4acfd/volumes"
Apr 17 16:29:36.814293 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.814256 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-557764dc6-8zw85"]
Apr 17 16:29:36.814677 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.814669 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2208711a-e000-4747-899f-b8cdf1e4acfd" containerName="manager"
Apr 17 16:29:36.814716 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.814680 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2208711a-e000-4747-899f-b8cdf1e4acfd" containerName="manager"
Apr 17 16:29:36.814757 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.814748 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2208711a-e000-4747-899f-b8cdf1e4acfd" containerName="manager"
Apr 17 16:29:36.819190 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.819171 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:36.821903 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.821876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-ps47p\""
Apr 17 16:29:36.822039 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.821876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 17 16:29:36.822039 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.821879 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 17 16:29:36.828490 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.828463 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-557764dc6-8zw85"]
Apr 17 16:29:36.936265 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.936190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-664rs\" (UniqueName: \"kubernetes.io/projected/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-kube-api-access-664rs\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:36.936455 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:36.936339 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.037245 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.037196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.037426 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.037339 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-664rs\" (UniqueName: \"kubernetes.io/projected/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-kube-api-access-664rs\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.037426 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:37.037359 2569 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 17 16:29:37.037513 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:37.037438 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls podName:c00dbd28-ba8a-49bb-ab1a-a752fc72c738 nodeName:}" failed. No retries permitted until 2026-04-17 16:29:37.537419288 +0000 UTC m=+563.918047634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls") pod "maas-api-557764dc6-8zw85" (UID: "c00dbd28-ba8a-49bb-ab1a-a752fc72c738") : secret "maas-api-serving-cert" not found
Apr 17 16:29:37.046648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.046622 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-664rs\" (UniqueName: \"kubernetes.io/projected/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-kube-api-access-664rs\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.541361 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.541327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.543830 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.543808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls\") pod \"maas-api-557764dc6-8zw85\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") " pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.731531 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.731496 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:37.866710 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:37.866684 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-557764dc6-8zw85"]
Apr 17 16:29:37.870312 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:29:37.870278 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00dbd28_ba8a_49bb_ab1a_a752fc72c738.slice/crio-481a064f84750d77f0b0cc8bd7afce7d58b845abd7ead95e91b56b19cb776177 WatchSource:0}: Error finding container 481a064f84750d77f0b0cc8bd7afce7d58b845abd7ead95e91b56b19cb776177: Status 404 returned error can't find the container with id 481a064f84750d77f0b0cc8bd7afce7d58b845abd7ead95e91b56b19cb776177
Apr 17 16:29:38.382364 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:38.382328 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-557764dc6-8zw85" event={"ID":"c00dbd28-ba8a-49bb-ab1a-a752fc72c738","Type":"ContainerStarted","Data":"481a064f84750d77f0b0cc8bd7afce7d58b845abd7ead95e91b56b19cb776177"}
Apr 17 16:29:40.392167 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:40.392126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-557764dc6-8zw85" event={"ID":"c00dbd28-ba8a-49bb-ab1a-a752fc72c738","Type":"ContainerStarted","Data":"b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a"}
Apr 17 16:29:40.392648 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:40.392251 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:40.409036 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:40.408989 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-557764dc6-8zw85" podStartSLOduration=2.814167499 podStartE2EDuration="4.408974447s" podCreationTimestamp="2026-04-17 16:29:36 +0000 UTC" firstStartedPulling="2026-04-17 16:29:37.871996585 +0000 UTC m=+564.252624928" lastFinishedPulling="2026-04-17 16:29:39.466803536 +0000 UTC m=+565.847431876" observedRunningTime="2026-04-17 16:29:40.406620989 +0000 UTC m=+566.787249365" watchObservedRunningTime="2026-04-17 16:29:40.408974447 +0000 UTC m=+566.789602822"
Apr 17 16:29:46.047471 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.047431 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7dbff48c98-4xvtn"]
Apr 17 16:29:46.047897 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.047662 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7dbff48c98-4xvtn" podUID="0b40210d-dd6b-425e-acf1-92ecc432a9be" containerName="manager" containerID="cri-o://eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd" gracePeriod=10
Apr 17 16:29:46.051723 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.051691 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:46.287388 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.287362 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:46.325968 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.325930 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9jsf\" (UniqueName: \"kubernetes.io/projected/0b40210d-dd6b-425e-acf1-92ecc432a9be-kube-api-access-r9jsf\") pod \"0b40210d-dd6b-425e-acf1-92ecc432a9be\" (UID: \"0b40210d-dd6b-425e-acf1-92ecc432a9be\") "
Apr 17 16:29:46.327946 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.327921 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5755dd7cbb-cln2h"]
Apr 17 16:29:46.328363 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.328348 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b40210d-dd6b-425e-acf1-92ecc432a9be" containerName="manager"
Apr 17 16:29:46.328363 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.328364 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b40210d-dd6b-425e-acf1-92ecc432a9be" containerName="manager"
Apr 17 16:29:46.328496 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.328431 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b40210d-dd6b-425e-acf1-92ecc432a9be" containerName="manager"
Apr 17 16:29:46.328624 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.328602 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b40210d-dd6b-425e-acf1-92ecc432a9be-kube-api-access-r9jsf" (OuterVolumeSpecName: "kube-api-access-r9jsf") pod "0b40210d-dd6b-425e-acf1-92ecc432a9be" (UID: "0b40210d-dd6b-425e-acf1-92ecc432a9be"). InnerVolumeSpecName "kube-api-access-r9jsf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:29:46.336948 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.336925 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:29:46.340029 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.340006 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5755dd7cbb-cln2h"]
Apr 17 16:29:46.402368 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.402343 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:29:46.420891 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.420860 2569 generic.go:358] "Generic (PLEG): container finished" podID="0b40210d-dd6b-425e-acf1-92ecc432a9be" containerID="eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd" exitCode=0
Apr 17 16:29:46.421072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.420907 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbff48c98-4xvtn" event={"ID":"0b40210d-dd6b-425e-acf1-92ecc432a9be","Type":"ContainerDied","Data":"eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd"}
Apr 17 16:29:46.421072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.420924 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dbff48c98-4xvtn"
Apr 17 16:29:46.421072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.420937 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dbff48c98-4xvtn" event={"ID":"0b40210d-dd6b-425e-acf1-92ecc432a9be","Type":"ContainerDied","Data":"e1d46df62db53421e12989ca4052f498b10987b579e21238dd4b47ccfbd55ab8"}
Apr 17 16:29:46.421072 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.420960 2569 scope.go:117] "RemoveContainer" containerID="eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd"
Apr 17 16:29:46.427542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.427409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94lt9\" (UniqueName: \"kubernetes.io/projected/a7adc5d2-9237-42d8-b35d-0f667707cd3a-kube-api-access-94lt9\") pod \"maas-controller-5755dd7cbb-cln2h\" (UID: \"a7adc5d2-9237-42d8-b35d-0f667707cd3a\") " pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:29:46.427542 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.427466 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r9jsf\" (UniqueName: \"kubernetes.io/projected/0b40210d-dd6b-425e-acf1-92ecc432a9be-kube-api-access-r9jsf\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:29:46.433343 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.433326 2569 scope.go:117] "RemoveContainer" containerID="eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd"
Apr 17 16:29:46.433624 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:29:46.433603 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd\": container with ID starting with eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd not found: ID does not exist" containerID="eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd"
Apr 17 16:29:46.433729 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.433631 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd"} err="failed to get container status \"eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd\": rpc error: code = NotFound desc = could not find container \"eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd\": container with ID starting with eac84c26dc5ed8f69f885c526b29e7a370d6c3a13b0bc44da32fa906dad4e0fd not found: ID does not exist"
Apr 17 16:29:46.448176 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.448152 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7dbff48c98-4xvtn"]
Apr 17 16:29:46.451502 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.451479 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7dbff48c98-4xvtn"]
Apr 17 16:29:46.528041 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.528003 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94lt9\" (UniqueName: \"kubernetes.io/projected/a7adc5d2-9237-42d8-b35d-0f667707cd3a-kube-api-access-94lt9\") pod \"maas-controller-5755dd7cbb-cln2h\" (UID: \"a7adc5d2-9237-42d8-b35d-0f667707cd3a\") " pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:29:46.535986 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.535960 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94lt9\" (UniqueName: \"kubernetes.io/projected/a7adc5d2-9237-42d8-b35d-0f667707cd3a-kube-api-access-94lt9\") pod \"maas-controller-5755dd7cbb-cln2h\" (UID: \"a7adc5d2-9237-42d8-b35d-0f667707cd3a\") " pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:29:46.648289 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.648156 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:29:46.776223 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:29:46.776181 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7adc5d2_9237_42d8_b35d_0f667707cd3a.slice/crio-2ae129e99906c1b10222e8a3f82d42af4d614baa267414ec0ffa09265c8e0427 WatchSource:0}: Error finding container 2ae129e99906c1b10222e8a3f82d42af4d614baa267414ec0ffa09265c8e0427: Status 404 returned error can't find the container with id 2ae129e99906c1b10222e8a3f82d42af4d614baa267414ec0ffa09265c8e0427
Apr 17 16:29:46.776396 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:46.776370 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5755dd7cbb-cln2h"]
Apr 17 16:29:47.426565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:47.426419 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5755dd7cbb-cln2h" event={"ID":"a7adc5d2-9237-42d8-b35d-0f667707cd3a","Type":"ContainerStarted","Data":"095259d4229eca63edc8ceb4c01c2bd988e4de52367a8acffd38b6ef3bdc1c61"}
Apr 17 16:29:47.426565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:47.426472 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5755dd7cbb-cln2h" event={"ID":"a7adc5d2-9237-42d8-b35d-0f667707cd3a","Type":"ContainerStarted","Data":"2ae129e99906c1b10222e8a3f82d42af4d614baa267414ec0ffa09265c8e0427"}
Apr 17 16:29:47.426565 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:47.426496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:29:47.445009 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:47.444943 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5755dd7cbb-cln2h" podStartSLOduration=1.136708637 podStartE2EDuration="1.444928294s" podCreationTimestamp="2026-04-17 16:29:46 +0000 UTC" firstStartedPulling="2026-04-17 16:29:46.77764361 +0000 UTC m=+573.158271950" lastFinishedPulling="2026-04-17 16:29:47.085863265 +0000 UTC m=+573.466491607" observedRunningTime="2026-04-17 16:29:47.442268916 +0000 UTC m=+573.822897278" watchObservedRunningTime="2026-04-17 16:29:47.444928294 +0000 UTC m=+573.825556655"
Apr 17 16:29:48.216740 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:48.216709 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b40210d-dd6b-425e-acf1-92ecc432a9be" path="/var/lib/kubelet/pods/0b40210d-dd6b-425e-acf1-92ecc432a9be/volumes"
Apr 17 16:29:58.436927 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:29:58.436896 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5755dd7cbb-cln2h"
Apr 17 16:30:17.488129 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.488088 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-55f9d656c8-xdws9"]
Apr 17 16:30:17.491765 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.491740 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.498640 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.498615 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-55f9d656c8-xdws9"]
Apr 17 16:30:17.603354 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.603315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/72026730-e31b-4bb0-9c07-3932db13003d-maas-api-tls\") pod \"maas-api-55f9d656c8-xdws9\" (UID: \"72026730-e31b-4bb0-9c07-3932db13003d\") " pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.603354 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.603364 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7dg\" (UniqueName: \"kubernetes.io/projected/72026730-e31b-4bb0-9c07-3932db13003d-kube-api-access-6z7dg\") pod \"maas-api-55f9d656c8-xdws9\" (UID: \"72026730-e31b-4bb0-9c07-3932db13003d\") " pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.704021 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.703984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/72026730-e31b-4bb0-9c07-3932db13003d-maas-api-tls\") pod \"maas-api-55f9d656c8-xdws9\" (UID: \"72026730-e31b-4bb0-9c07-3932db13003d\") " pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.704216 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.704035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7dg\" (UniqueName: \"kubernetes.io/projected/72026730-e31b-4bb0-9c07-3932db13003d-kube-api-access-6z7dg\") pod \"maas-api-55f9d656c8-xdws9\" (UID: \"72026730-e31b-4bb0-9c07-3932db13003d\") " pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.706527 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.706500 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/72026730-e31b-4bb0-9c07-3932db13003d-maas-api-tls\") pod \"maas-api-55f9d656c8-xdws9\" (UID: \"72026730-e31b-4bb0-9c07-3932db13003d\") " pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.712193 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.712153 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7dg\" (UniqueName: \"kubernetes.io/projected/72026730-e31b-4bb0-9c07-3932db13003d-kube-api-access-6z7dg\") pod \"maas-api-55f9d656c8-xdws9\" (UID: \"72026730-e31b-4bb0-9c07-3932db13003d\") " pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.805130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.805032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:17.951115 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.951082 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-55f9d656c8-xdws9"]
Apr 17 16:30:17.953384 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:30:17.953354 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72026730_e31b_4bb0_9c07_3932db13003d.slice/crio-510bb7c1630ebe0bc69bdc66000cae26231616188da567a4f7b009bc71ee6daa WatchSource:0}: Error finding container 510bb7c1630ebe0bc69bdc66000cae26231616188da567a4f7b009bc71ee6daa: Status 404 returned error can't find the container with id 510bb7c1630ebe0bc69bdc66000cae26231616188da567a4f7b009bc71ee6daa
Apr 17 16:30:17.954783 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:17.954761 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:30:18.557412 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:18.557365 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55f9d656c8-xdws9" event={"ID":"72026730-e31b-4bb0-9c07-3932db13003d","Type":"ContainerStarted","Data":"510bb7c1630ebe0bc69bdc66000cae26231616188da567a4f7b009bc71ee6daa"}
Apr 17 16:30:20.567969 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:20.567929 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55f9d656c8-xdws9" event={"ID":"72026730-e31b-4bb0-9c07-3932db13003d","Type":"ContainerStarted","Data":"53feb9d703b18a1c5d07ed9b68a4347a2891444d98cec3712061f29ebd2d021b"}
Apr 17 16:30:20.568468 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:20.568015 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:20.585195 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:20.585139 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-55f9d656c8-xdws9" podStartSLOduration=1.6537532179999999 podStartE2EDuration="3.585122934s" podCreationTimestamp="2026-04-17 16:30:17 +0000 UTC" firstStartedPulling="2026-04-17 16:30:17.954885198 +0000 UTC m=+604.335513537" lastFinishedPulling="2026-04-17 16:30:19.886254909 +0000 UTC m=+606.266883253" observedRunningTime="2026-04-17 16:30:20.582599943 +0000 UTC m=+606.963228305" watchObservedRunningTime="2026-04-17 16:30:20.585122934 +0000 UTC m=+606.965751295"
Apr 17 16:30:26.577435 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.577346 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-55f9d656c8-xdws9"
Apr 17 16:30:26.619630 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.619595 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-557764dc6-8zw85"]
Apr 17 16:30:26.619868 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.619844 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-557764dc6-8zw85" podUID="c00dbd28-ba8a-49bb-ab1a-a752fc72c738" containerName="maas-api" containerID="cri-o://b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a" gracePeriod=30
Apr 17 16:30:26.880125 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.880095 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:30:26.994155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.994118 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls\") pod \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") "
Apr 17 16:30:26.994371 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.994164 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-664rs\" (UniqueName: \"kubernetes.io/projected/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-kube-api-access-664rs\") pod \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\" (UID: \"c00dbd28-ba8a-49bb-ab1a-a752fc72c738\") "
Apr 17 16:30:26.996303 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.996264 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "c00dbd28-ba8a-49bb-ab1a-a752fc72c738" (UID: "c00dbd28-ba8a-49bb-ab1a-a752fc72c738"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:30:26.996447 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:26.996363 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-kube-api-access-664rs" (OuterVolumeSpecName: "kube-api-access-664rs") pod "c00dbd28-ba8a-49bb-ab1a-a752fc72c738" (UID: "c00dbd28-ba8a-49bb-ab1a-a752fc72c738"). InnerVolumeSpecName "kube-api-access-664rs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:30:27.095027 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.094984 2569 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-maas-api-tls\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:30:27.095027 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.095022 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-664rs\" (UniqueName: \"kubernetes.io/projected/c00dbd28-ba8a-49bb-ab1a-a752fc72c738-kube-api-access-664rs\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:30:27.600374 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.600337 2569 generic.go:358] "Generic (PLEG): container finished" podID="c00dbd28-ba8a-49bb-ab1a-a752fc72c738" containerID="b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a" exitCode=0
Apr 17 16:30:27.600839 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.600399 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-557764dc6-8zw85"
Apr 17 16:30:27.600839 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.600428 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-557764dc6-8zw85" event={"ID":"c00dbd28-ba8a-49bb-ab1a-a752fc72c738","Type":"ContainerDied","Data":"b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a"}
Apr 17 16:30:27.600839 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.600480 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-557764dc6-8zw85" event={"ID":"c00dbd28-ba8a-49bb-ab1a-a752fc72c738","Type":"ContainerDied","Data":"481a064f84750d77f0b0cc8bd7afce7d58b845abd7ead95e91b56b19cb776177"}
Apr 17 16:30:27.600839 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.600502 2569 scope.go:117] "RemoveContainer" containerID="b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a"
Apr 17 16:30:27.609886 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.609869 2569 scope.go:117] "RemoveContainer" containerID="b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a"
Apr 17 16:30:27.610144 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:30:27.610128 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a\": container with ID starting with b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a not found: ID does not exist" containerID="b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a"
Apr 17 16:30:27.610189 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.610152 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a"} err="failed to get container status \"b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a\": rpc error: code = NotFound desc = could not find container \"b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a\": container with ID starting with b7e0d43211f1229664df2e73fadd9c680fdd71eaf40cb01aa0d57660262aca7a not found: ID does not exist"
Apr 17 16:30:27.621543 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.621506 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-557764dc6-8zw85"]
Apr 17 16:30:27.624011 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.623989 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-557764dc6-8zw85"]
Apr 17 16:30:27.902998 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.902917 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"]
Apr 17 16:30:27.903358 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.903344 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c00dbd28-ba8a-49bb-ab1a-a752fc72c738" containerName="maas-api"
Apr 17 16:30:27.903409 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.903363 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00dbd28-ba8a-49bb-ab1a-a752fc72c738" containerName="maas-api"
Apr 17 16:30:27.903458 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.903448 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c00dbd28-ba8a-49bb-ab1a-a752fc72c738" containerName="maas-api"
Apr 17 16:30:27.908003 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.907985 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:27.911471 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.911408 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 17 16:30:27.911613 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.911433 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-7xrks\""
Apr 17 16:30:27.911613 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.911439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 17 16:30:27.911613 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.911439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 17 16:30:27.913954 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:27.913931 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"]
Apr 17 16:30:28.104683 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.104648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.104855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.104705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.104855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.104723 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f923ef57-0eb1-40c3-b4f1-281584825171-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.104855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.104750 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.104855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.104789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.104855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.104816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtswg\" (UniqueName: \"kubernetes.io/projected/f923ef57-0eb1-40c3-b4f1-281584825171-kube-api-access-gtswg\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205386 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205386 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f923ef57-0eb1-40c3-b4f1-281584825171-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205386 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205673 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205409 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205673 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtswg\" (UniqueName: \"kubernetes.io/projected/f923ef57-0eb1-40c3-b4f1-281584825171-kube-api-access-gtswg\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205673 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205835 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.205913 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.205892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.206123 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.206091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.208164 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.208142 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f923ef57-0eb1-40c3-b4f1-281584825171-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.208328 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.208301 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f923ef57-0eb1-40c3-b4f1-281584825171-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.222844 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.219946 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00dbd28-ba8a-49bb-ab1a-a752fc72c738" path="/var/lib/kubelet/pods/c00dbd28-ba8a-49bb-ab1a-a752fc72c738/volumes"
Apr 17 16:30:28.222844 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.222274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtswg\" (UniqueName: \"kubernetes.io/projected/f923ef57-0eb1-40c3-b4f1-281584825171-kube-api-access-gtswg\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj\" (UID: \"f923ef57-0eb1-40c3-b4f1-281584825171\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.520796 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.520704 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:28.653218 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:28.653187 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"]
Apr 17 16:30:28.654072 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:30:28.654045 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf923ef57_0eb1_40c3_b4f1_281584825171.slice/crio-98b77bbb81b45766fe93b89e406bbdf0a2f7d4f895429401c7a939860b3a62af WatchSource:0}: Error finding container 98b77bbb81b45766fe93b89e406bbdf0a2f7d4f895429401c7a939860b3a62af: Status 404 returned error can't find the container with id 98b77bbb81b45766fe93b89e406bbdf0a2f7d4f895429401c7a939860b3a62af
Apr 17 16:30:29.613830 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:29.613782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj" event={"ID":"f923ef57-0eb1-40c3-b4f1-281584825171","Type":"ContainerStarted","Data":"98b77bbb81b45766fe93b89e406bbdf0a2f7d4f895429401c7a939860b3a62af"}
Apr 17 16:30:32.209722 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.209675 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"]
Apr 17 16:30:32.233594 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.233560 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"
Apr 17 16:30:32.236392 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.236362 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 17 16:30:32.241017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.240992 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"]
Apr 17 16:30:32.342670 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.342627 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"
Apr 17 16:30:32.342865 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.342725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8af66ab9-9738-4e24-be29-00deff869f9a-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"
Apr 17 16:30:32.342865 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.342757 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"
Apr 17 16:30:32.342865 ip-10-0-136-214
kubenswrapper[2569]: I0417 16:30:32.342834 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.343097 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.342865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pcjq\" (UniqueName: \"kubernetes.io/projected/8af66ab9-9738-4e24-be29-00deff869f9a-kube-api-access-6pcjq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.343097 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.342913 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.443752 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.443715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.443956 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.443792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.443956 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.443823 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pcjq\" (UniqueName: \"kubernetes.io/projected/8af66ab9-9738-4e24-be29-00deff869f9a-kube-api-access-6pcjq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.443956 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.443873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.443956 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.443905 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.444194 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.443982 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8af66ab9-9738-4e24-be29-00deff869f9a-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.444282 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.444250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.444508 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.444407 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.444629 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.444515 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.446202 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.446184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8af66ab9-9738-4e24-be29-00deff869f9a-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.447427 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.446912 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8af66ab9-9738-4e24-be29-00deff869f9a-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.452423 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:32.452400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pcjq\" (UniqueName: \"kubernetes.io/projected/8af66ab9-9738-4e24-be29-00deff869f9a-kube-api-access-6pcjq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7\" (UID: \"8af66ab9-9738-4e24-be29-00deff869f9a\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:32.547573 ip-10-0-136-214 kubenswrapper[2569]: 
I0417 16:30:32.547488 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:33.878327 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:33.878303 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"] Apr 17 16:30:33.881417 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:30:33.881390 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8af66ab9_9738_4e24_be29_00deff869f9a.slice/crio-642af888bc6fbe6bff0ba1ae99eae1b36af5b1e39d1846308b45b49f7be27061 WatchSource:0}: Error finding container 642af888bc6fbe6bff0ba1ae99eae1b36af5b1e39d1846308b45b49f7be27061: Status 404 returned error can't find the container with id 642af888bc6fbe6bff0ba1ae99eae1b36af5b1e39d1846308b45b49f7be27061 Apr 17 16:30:34.638770 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:34.638726 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj" event={"ID":"f923ef57-0eb1-40c3-b4f1-281584825171","Type":"ContainerStarted","Data":"63e8365f23ed102d6fc5c108a84be3d0ee6c2cfce11a31b1db917a55ea4aeff3"} Apr 17 16:30:34.640386 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:34.640356 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" event={"ID":"8af66ab9-9738-4e24-be29-00deff869f9a","Type":"ContainerStarted","Data":"3dabb6e0f32f92b15738afdea96f3ca73557ccc2eb6df801ea013904fccc3078"} Apr 17 16:30:34.640512 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:34.640392 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" event={"ID":"8af66ab9-9738-4e24-be29-00deff869f9a","Type":"ContainerStarted","Data":"642af888bc6fbe6bff0ba1ae99eae1b36af5b1e39d1846308b45b49f7be27061"} Apr 17 16:30:39.664549 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:39.664513 2569 generic.go:358] "Generic (PLEG): container finished" podID="8af66ab9-9738-4e24-be29-00deff869f9a" containerID="3dabb6e0f32f92b15738afdea96f3ca73557ccc2eb6df801ea013904fccc3078" exitCode=0 Apr 17 16:30:39.664929 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:39.664586 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" event={"ID":"8af66ab9-9738-4e24-be29-00deff869f9a","Type":"ContainerDied","Data":"3dabb6e0f32f92b15738afdea96f3ca73557ccc2eb6df801ea013904fccc3078"} Apr 17 16:30:39.666168 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:39.666147 2569 generic.go:358] "Generic (PLEG): container finished" podID="f923ef57-0eb1-40c3-b4f1-281584825171" containerID="63e8365f23ed102d6fc5c108a84be3d0ee6c2cfce11a31b1db917a55ea4aeff3" exitCode=0 Apr 17 16:30:39.666273 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:39.666193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj" event={"ID":"f923ef57-0eb1-40c3-b4f1-281584825171","Type":"ContainerDied","Data":"63e8365f23ed102d6fc5c108a84be3d0ee6c2cfce11a31b1db917a55ea4aeff3"} Apr 17 16:30:44.693677 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:44.693632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj" 
event={"ID":"f923ef57-0eb1-40c3-b4f1-281584825171","Type":"ContainerStarted","Data":"66ca5bece1d8123d1869e0811a208281898f2a46bcf401b5a38fa058989a3c10"} Apr 17 16:30:44.694164 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:44.693915 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj" Apr 17 16:30:44.695386 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:44.695364 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" event={"ID":"8af66ab9-9738-4e24-be29-00deff869f9a","Type":"ContainerStarted","Data":"49494f69c46756c049b3a10dd4bceb74c28bec188792079074a4eeab055fe0ca"} Apr 17 16:30:44.695556 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:44.695540 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" Apr 17 16:30:44.711049 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:44.711003 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj" podStartSLOduration=2.218442654 podStartE2EDuration="17.710990972s" podCreationTimestamp="2026-04-17 16:30:27 +0000 UTC" firstStartedPulling="2026-04-17 16:30:28.655855162 +0000 UTC m=+615.036483501" lastFinishedPulling="2026-04-17 16:30:44.148403467 +0000 UTC m=+630.529031819" observedRunningTime="2026-04-17 16:30:44.709740337 +0000 UTC m=+631.090368697" watchObservedRunningTime="2026-04-17 16:30:44.710990972 +0000 UTC m=+631.091619345" Apr 17 16:30:44.726908 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:44.726860 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7" podStartSLOduration=8.258872596 podStartE2EDuration="12.726848617s" podCreationTimestamp="2026-04-17 16:30:32 +0000 UTC" firstStartedPulling="2026-04-17 16:30:39.665399898 +0000 UTC m=+626.046028251" lastFinishedPulling="2026-04-17 16:30:44.133375923 +0000 UTC m=+630.514004272" observedRunningTime="2026-04-17 16:30:44.724744432 +0000 UTC m=+631.105372792" watchObservedRunningTime="2026-04-17 16:30:44.726848617 +0000 UTC m=+631.107476977" Apr 17 16:30:49.210705 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.210669 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"] Apr 17 16:30:49.224131 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.224098 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"] Apr 17 16:30:49.224333 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.224221 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.227021 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.226994 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 16:30:49.313973 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.313938 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.314155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.313984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d22d608-7963-49e3-b4aa-2983477e0cb7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.314155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.314026 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.314155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.314070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxzc\" (UniqueName: \"kubernetes.io/projected/9d22d608-7963-49e3-b4aa-2983477e0cb7-kube-api-access-6lxzc\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.314155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.314111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.314155 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.314126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" Apr 17 16:30:49.415493 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415457 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6lxzc\" (UniqueName: \"kubernetes.io/projected/9d22d608-7963-49e3-b4aa-2983477e0cb7-kube-api-access-6lxzc\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" 
Apr 17 16:30:49.415657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415509 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.415657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.415657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415609 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.415657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d22d608-7963-49e3-b4aa-2983477e0cb7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.415860 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415687 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.416017 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.415991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.416089 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.416003 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.416139 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.416123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.417893 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.417864 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9d22d608-7963-49e3-b4aa-2983477e0cb7-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.418299 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.418278 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9d22d608-7963-49e3-b4aa-2983477e0cb7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.426112 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.426086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lxzc\" (UniqueName: \"kubernetes.io/projected/9d22d608-7963-49e3-b4aa-2983477e0cb7-kube-api-access-6lxzc\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4\" (UID: \"9d22d608-7963-49e3-b4aa-2983477e0cb7\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.537994 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.537899 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:49.668032 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.667998 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"]
Apr 17 16:30:49.670273 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:30:49.670212 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d22d608_7963_49e3_b4aa_2983477e0cb7.slice/crio-02151e2df654340cc5aeb74d5af9c3ae076b5ee687c237d2d41de28329fad0bf WatchSource:0}: Error finding container 02151e2df654340cc5aeb74d5af9c3ae076b5ee687c237d2d41de28329fad0bf: Status 404 returned error can't find the container with id 02151e2df654340cc5aeb74d5af9c3ae076b5ee687c237d2d41de28329fad0bf
Apr 17 16:30:49.716842 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:49.716803 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" event={"ID":"9d22d608-7963-49e3-b4aa-2983477e0cb7","Type":"ContainerStarted","Data":"02151e2df654340cc5aeb74d5af9c3ae076b5ee687c237d2d41de28329fad0bf"}
Apr 17 16:30:50.723276 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:50.723240 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" event={"ID":"9d22d608-7963-49e3-b4aa-2983477e0cb7","Type":"ContainerStarted","Data":"b894f3257580754c89815e5ed67ff6e7f8eb09d7c491fe06655785f558b8e756"}
Apr 17 16:30:55.712820 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:55.712790 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj"
Apr 17 16:30:55.713520 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:55.713501 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7"
Apr 17 16:30:55.747715 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:55.747671 2569 generic.go:358] "Generic (PLEG): container finished" podID="9d22d608-7963-49e3-b4aa-2983477e0cb7" containerID="b894f3257580754c89815e5ed67ff6e7f8eb09d7c491fe06655785f558b8e756" exitCode=0
Apr 17 16:30:55.747916 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:55.747729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" event={"ID":"9d22d608-7963-49e3-b4aa-2983477e0cb7","Type":"ContainerDied","Data":"b894f3257580754c89815e5ed67ff6e7f8eb09d7c491fe06655785f558b8e756"}
Apr 17 16:30:56.753694 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:56.753660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" event={"ID":"9d22d608-7963-49e3-b4aa-2983477e0cb7","Type":"ContainerStarted","Data":"aa30126bd0f1c9a16d63b1cd8929900b08c33d9b61622a11af59e7913573bee9"}
Apr 17 16:30:56.754206 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:56.753886 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:30:56.774756 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:30:56.774706 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4" podStartSLOduration=7.4025438470000005 podStartE2EDuration="7.774691738s" podCreationTimestamp="2026-04-17 16:30:49 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.748773263 +0000 UTC m=+642.129401604" lastFinishedPulling="2026-04-17 16:30:56.120921143 +0000 UTC m=+642.501549495" observedRunningTime="2026-04-17 16:30:56.772060549 +0000 UTC m=+643.152688921" watchObservedRunningTime="2026-04-17 16:30:56.774691738 +0000 UTC m=+643.155320098"
Apr 17 16:31:07.771142 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:07.771105 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4"
Apr 17 16:31:20.487549 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.487510 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7f6f748cc-zppx8"]
Apr 17 16:31:20.488210 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.487709 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7f6f748cc-zppx8" podUID="3ee2df98-88cb-4cac-80e7-204599e8265c" containerName="authorino" containerID="cri-o://d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5" gracePeriod=30
Apr 17 16:31:20.735761 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.735731 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:31:20.799978 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.799891 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74mg8\" (UniqueName: \"kubernetes.io/projected/3ee2df98-88cb-4cac-80e7-204599e8265c-kube-api-access-74mg8\") pod \"3ee2df98-88cb-4cac-80e7-204599e8265c\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") "
Apr 17 16:31:20.799978 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.799935 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3ee2df98-88cb-4cac-80e7-204599e8265c-tls-cert\") pod \"3ee2df98-88cb-4cac-80e7-204599e8265c\" (UID: \"3ee2df98-88cb-4cac-80e7-204599e8265c\") "
Apr 17 16:31:20.802075 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.802035 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee2df98-88cb-4cac-80e7-204599e8265c-kube-api-access-74mg8" (OuterVolumeSpecName: "kube-api-access-74mg8") pod "3ee2df98-88cb-4cac-80e7-204599e8265c" (UID: "3ee2df98-88cb-4cac-80e7-204599e8265c"). InnerVolumeSpecName "kube-api-access-74mg8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:31:20.810995 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.810962 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ee2df98-88cb-4cac-80e7-204599e8265c-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "3ee2df98-88cb-4cac-80e7-204599e8265c" (UID: "3ee2df98-88cb-4cac-80e7-204599e8265c"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:31:20.855782 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.855741 2569 generic.go:358] "Generic (PLEG): container finished" podID="3ee2df98-88cb-4cac-80e7-204599e8265c" containerID="d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5" exitCode=0
Apr 17 16:31:20.855952 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.855798 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7f6f748cc-zppx8"
Apr 17 16:31:20.855952 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.855818 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7f6f748cc-zppx8" event={"ID":"3ee2df98-88cb-4cac-80e7-204599e8265c","Type":"ContainerDied","Data":"d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5"}
Apr 17 16:31:20.855952 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.855858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7f6f748cc-zppx8" event={"ID":"3ee2df98-88cb-4cac-80e7-204599e8265c","Type":"ContainerDied","Data":"844594899c26d6acdc4a23d183b585c254da1c474d856987f38ebd7ffc44e165"}
Apr 17 16:31:20.855952 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.855874 2569 scope.go:117] "RemoveContainer" containerID="d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5"
Apr 17 16:31:20.865282 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.865261 2569 scope.go:117] "RemoveContainer" containerID="d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5"
Apr 17 16:31:20.865623 ip-10-0-136-214 kubenswrapper[2569]: E0417 16:31:20.865590 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5\": container with ID starting with d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5 not found: ID does not exist" containerID="d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5"
Apr 17 16:31:20.865721 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.865658 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5"} err="failed to get container status \"d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5\": rpc error: code = NotFound desc = could not find container \"d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5\": container with ID starting with d40418b31fe38ff85cb204dde309b5c1b6690edd7d43d1f54f46d90653ddb0a5 not found: ID does not exist"
Apr 17 16:31:20.877785 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.877754 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7f6f748cc-zppx8"]
Apr 17 16:31:20.880183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.880154 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7f6f748cc-zppx8"]
Apr 17 16:31:20.901274 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.901244 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-74mg8\" (UniqueName: \"kubernetes.io/projected/3ee2df98-88cb-4cac-80e7-204599e8265c-kube-api-access-74mg8\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:31:20.901274 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:20.901271 2569 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3ee2df98-88cb-4cac-80e7-204599e8265c-tls-cert\") on node \"ip-10-0-136-214.ec2.internal\" DevicePath \"\""
Apr 17 16:31:22.217451 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:31:22.217415 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee2df98-88cb-4cac-80e7-204599e8265c" path="/var/lib/kubelet/pods/3ee2df98-88cb-4cac-80e7-204599e8265c/volumes"
Apr 17 16:49:18.193594 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:18.193565 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-55f9d656c8-xdws9_72026730-e31b-4bb0-9c07-3932db13003d/maas-api/0.log"
Apr 17 16:49:18.311408 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:18.311377 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5755dd7cbb-cln2h_a7adc5d2-9237-42d8-b35d-0f667707cd3a/manager/0.log"
Apr 17 16:49:18.779707 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:18.779681 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54994d49cf-xncm4_177e44a7-ba5c-44b4-901b-5c0baa4df9fe/manager/0.log"
Apr 17 16:49:18.884979 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:18.884945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-swrsx_c5474b0d-7a8c-48cf-bb20-e3d94df6c617/postgres/0.log"
Apr 17 16:49:19.612091 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.612060 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562_2ee32c6e-ade8-495c-96a8-b71e6126eaec/util/0.log"
Apr 17 16:49:19.619316 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.619288 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562_2ee32c6e-ade8-495c-96a8-b71e6126eaec/pull/0.log"
Apr 17 16:49:19.625417 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.625399 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562_2ee32c6e-ade8-495c-96a8-b71e6126eaec/extract/0.log"
Apr 17 16:49:19.734660 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.734630 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn_15631fe2-e551-42b9-a8df-6a77f70d7753/util/0.log"
Apr 17 16:49:19.741057 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.741033 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn_15631fe2-e551-42b9-a8df-6a77f70d7753/pull/0.log"
Apr 17 16:49:19.747435 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.747416 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn_15631fe2-e551-42b9-a8df-6a77f70d7753/extract/0.log"
Apr 17 16:49:19.853395 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.853368 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5_742cf387-cfb5-4d5a-93ef-8a54c1b98ec7/util/0.log"
Apr 17 16:49:19.860380 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.860355 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5_742cf387-cfb5-4d5a-93ef-8a54c1b98ec7/pull/0.log"
Apr 17 16:49:19.867136 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.867082 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5_742cf387-cfb5-4d5a-93ef-8a54c1b98ec7/extract/0.log"
Apr 17 16:49:19.975104 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.975077 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x_5d31e936-9d34-43bf-acdb-2912119ed690/util/0.log"
Apr 17 16:49:19.981889 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.981867 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x_5d31e936-9d34-43bf-acdb-2912119ed690/pull/0.log"
Apr 17 16:49:19.988696 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:19.988675 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x_5d31e936-9d34-43bf-acdb-2912119ed690/extract/0.log"
Apr 17 16:49:20.533817 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:20.533786 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-x66nk_3289d8f3-9ffd-4bb2-965e-65978d3a1a33/registry-server/0.log"
Apr 17 16:49:20.655191 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:20.655159 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-shv97_eb4a0194-736b-4b69-ab5a-32731bbecb8c/manager/0.log"
Apr 17 16:49:21.207318 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:21.207291 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv_f5604b22-37d7-4787-9612-bca485e7867d/istio-proxy/0.log"
Apr 17 16:49:21.643963 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:21.643928 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-9rngn_f8651809-7728-4f9e-ac55-a5441cb3d52d/istio-proxy/0.log"
Apr 17 16:49:22.081098 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:22.081070 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj_f923ef57-0eb1-40c3-b4f1-281584825171/storage-initializer/0.log"
Apr 17 16:49:22.088928 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:22.088907 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-pv2zj_f923ef57-0eb1-40c3-b4f1-281584825171/main/0.log"
Apr 17 16:49:22.302809 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:22.302776 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7_8af66ab9-9738-4e24-be29-00deff869f9a/storage-initializer/0.log"
Apr 17 16:49:22.313376 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:22.313345 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccslfg7_8af66ab9-9738-4e24-be29-00deff869f9a/main/0.log"
Apr 17 16:49:22.422025 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:22.421932 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4_9d22d608-7963-49e3-b4aa-2983477e0cb7/storage-initializer/0.log"
Apr 17 16:49:22.434209 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:22.434181 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-pr4q4_9d22d608-7963-49e3-b4aa-2983477e0cb7/main/0.log"
Apr 17 16:49:29.691649 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:29.691614 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-b6drk_338e932a-1f5c-4a6b-8be1-288700fd3608/global-pull-secret-syncer/0.log"
Apr 17 16:49:29.811805 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:29.811776 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pr4pk_3d4f3209-9daa-4cca-9236-5918fad01d8d/konnectivity-agent/0.log"
Apr 17 16:49:29.937570 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:29.937541 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-214.ec2.internal_06511d0037f371d29f77ad6b941c9dbf/haproxy/0.log"
Apr 17 16:49:33.536808 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.536772 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562_2ee32c6e-ade8-495c-96a8-b71e6126eaec/extract/0.log"
Apr 17 16:49:33.559279 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.559248 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562_2ee32c6e-ade8-495c-96a8-b71e6126eaec/util/0.log"
Apr 17 16:49:33.579841 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.579805 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599l562_2ee32c6e-ade8-495c-96a8-b71e6126eaec/pull/0.log"
Apr 17 16:49:33.615273 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.615207 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn_15631fe2-e551-42b9-a8df-6a77f70d7753/extract/0.log"
Apr 17 16:49:33.635022 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.634990 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn_15631fe2-e551-42b9-a8df-6a77f70d7753/util/0.log"
Apr 17 16:49:33.654943 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.654914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dgtjn_15631fe2-e551-42b9-a8df-6a77f70d7753/pull/0.log"
Apr 17 16:49:33.694346 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.694304 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5_742cf387-cfb5-4d5a-93ef-8a54c1b98ec7/extract/0.log"
Apr 17 16:49:33.713448 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.713421 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5_742cf387-cfb5-4d5a-93ef-8a54c1b98ec7/util/0.log"
Apr 17 16:49:33.733159 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.733137 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73xcts5_742cf387-cfb5-4d5a-93ef-8a54c1b98ec7/pull/0.log"
Apr 17 16:49:33.763605 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.763578 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x_5d31e936-9d34-43bf-acdb-2912119ed690/extract/0.log"
Apr 17 16:49:33.790341 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.790261 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x_5d31e936-9d34-43bf-acdb-2912119ed690/util/0.log"
Apr 17 16:49:33.813256 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:33.813201 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18vm2x_5d31e936-9d34-43bf-acdb-2912119ed690/pull/0.log"
Apr 17 16:49:34.163060 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:34.163017 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-x66nk_3289d8f3-9ffd-4bb2-965e-65978d3a1a33/registry-server/0.log"
Apr 17 16:49:34.284337 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:34.284281 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-shv97_eb4a0194-736b-4b69-ab5a-32731bbecb8c/manager/0.log"
Apr 17 16:49:35.740621 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:35.740588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-rtp7q_487fdfa2-04ee-41df-9603-b59486486e7e/cluster-monitoring-operator/0.log"
Apr 17 16:49:35.881987 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:35.881944 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5fdb8f4b6c-5f6rz_f59d2ca0-0063-4dfb-bb47-dc1e456cc4b2/metrics-server/0.log"
Apr 17 16:49:36.077261 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.077219 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v5n6c_4b73a405-15f3-43c5-bf6a-43a8219a181a/node-exporter/0.log"
Apr 17 16:49:36.122321 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.122293 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v5n6c_4b73a405-15f3-43c5-bf6a-43a8219a181a/kube-rbac-proxy/0.log"
Apr 17 16:49:36.142734 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.142705 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v5n6c_4b73a405-15f3-43c5-bf6a-43a8219a181a/init-textfile/0.log"
Apr 17 16:49:36.436890 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.436782 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jdrxz_b2347694-fcef-49a0-9562-dc4a50f629e0/prometheus-operator/0.log"
Apr 17 16:49:36.460158 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.460126 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jdrxz_b2347694-fcef-49a0-9562-dc4a50f629e0/kube-rbac-proxy/0.log"
Apr 17 16:49:36.484139 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.484111 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-xzkcw_fdd67f61-79df-48f1-af95-85e192096fa7/prometheus-operator-admission-webhook/0.log"
Apr 17 16:49:36.511938 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.511901 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-78bf64869c-bbggk_509250ea-5b8f-46b2-9140-e92b0d75346e/telemeter-client/0.log"
Apr 17 16:49:36.539335 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.539311 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-78bf64869c-bbggk_509250ea-5b8f-46b2-9140-e92b0d75346e/reload/0.log"
Apr 17 16:49:36.563126 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.563100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-78bf64869c-bbggk_509250ea-5b8f-46b2-9140-e92b0d75346e/kube-rbac-proxy/0.log"
Apr 17 16:49:36.592620 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.592594 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f4f5c6b8-6cdhc_058c398d-5b48-483c-bc96-bd8e2f9f3bc3/thanos-query/0.log"
Apr 17 16:49:36.612181 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.612154 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f4f5c6b8-6cdhc_058c398d-5b48-483c-bc96-bd8e2f9f3bc3/kube-rbac-proxy-web/0.log"
Apr 17 16:49:36.638719 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.638691 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f4f5c6b8-6cdhc_058c398d-5b48-483c-bc96-bd8e2f9f3bc3/kube-rbac-proxy/0.log"
Apr 17 16:49:36.656240 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.656212 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f4f5c6b8-6cdhc_058c398d-5b48-483c-bc96-bd8e2f9f3bc3/prom-label-proxy/0.log"
Apr 17 16:49:36.676953 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.676925 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f4f5c6b8-6cdhc_058c398d-5b48-483c-bc96-bd8e2f9f3bc3/kube-rbac-proxy-rules/0.log"
Apr 17 16:49:36.696246 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:36.696165 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-64f4f5c6b8-6cdhc_058c398d-5b48-483c-bc96-bd8e2f9f3bc3/kube-rbac-proxy-metrics/0.log"
Apr 17 16:49:38.484619 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.484588 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"]
Apr 17 16:49:38.485005 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.484992 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ee2df98-88cb-4cac-80e7-204599e8265c" containerName="authorino"
Apr 17 16:49:38.485050 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.485007 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee2df98-88cb-4cac-80e7-204599e8265c" containerName="authorino"
Apr 17 16:49:38.485086 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.485080 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ee2df98-88cb-4cac-80e7-204599e8265c" containerName="authorino"
Apr 17 16:49:38.488183 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.488167 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.490757 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.490734 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98l5c\"/\"kube-root-ca.crt\""
Apr 17 16:49:38.490887 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.490872 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98l5c\"/\"openshift-service-ca.crt\""
Apr 17 16:49:38.491924 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.491908 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-98l5c\"/\"default-dockercfg-j7zpj\""
Apr 17 16:49:38.498704 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.498683 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"]
Apr 17 16:49:38.572336 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.572306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-proc\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.572336 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.572344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfnw\" (UniqueName: \"kubernetes.io/projected/a15df6ac-1dfb-460b-ab28-53a70141a456-kube-api-access-zhfnw\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.572605 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.572384 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-podres\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.572605 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.572417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-sys\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.572605 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.572437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-lib-modules\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673614 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-proc\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673614 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfnw\" (UniqueName: \"kubernetes.io/projected/a15df6ac-1dfb-460b-ab28-53a70141a456-kube-api-access-zhfnw\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-podres\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673681 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-sys\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-lib-modules\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-proc\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.673855 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673839 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-podres\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.674074 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673851 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-sys\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.674074 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.673865 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a15df6ac-1dfb-460b-ab28-53a70141a456-lib-modules\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.681810 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.681787 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfnw\" (UniqueName: \"kubernetes.io/projected/a15df6ac-1dfb-460b-ab28-53a70141a456-kube-api-access-zhfnw\") pod \"perf-node-gather-daemonset-82dsc\" (UID: \"a15df6ac-1dfb-460b-ab28-53a70141a456\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.799891 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.799794 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:38.925130 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.925098 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"]
Apr 17 16:49:38.926445 ip-10-0-136-214 kubenswrapper[2569]: W0417 16:49:38.926415 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda15df6ac_1dfb_460b_ab28_53a70141a456.slice/crio-8ec80d2ee257010f0be95175dc579a82052ce5c03c483856f896e785c05a6778 WatchSource:0}: Error finding container 8ec80d2ee257010f0be95175dc579a82052ce5c03c483856f896e785c05a6778: Status 404 returned error can't find the container with id 8ec80d2ee257010f0be95175dc579a82052ce5c03c483856f896e785c05a6778
Apr 17 16:49:38.928192 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:38.928171 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:49:39.048701 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:39.048661 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f47b6d896-zckpn_0b95e642-d03d-4713-923a-ac3c5b1ff4db/console/0.log"
Apr 17 16:49:39.081880 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:39.081257 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-kkz7m_194f3893-97c5-431f-b1a4-de3bc7c0dcbe/download-server/0.log"
Apr 17 16:49:39.371255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:39.371153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc" event={"ID":"a15df6ac-1dfb-460b-ab28-53a70141a456","Type":"ContainerStarted","Data":"81de27ab58de87811ae099cd7f0b8ed8d337770219342d7affa558fde4ea530e"}
Apr 17 16:49:39.371255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:39.371190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc" event={"ID":"a15df6ac-1dfb-460b-ab28-53a70141a456","Type":"ContainerStarted","Data":"8ec80d2ee257010f0be95175dc579a82052ce5c03c483856f896e785c05a6778"}
Apr 17 16:49:39.371255 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:39.371223 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc"
Apr 17 16:49:39.388989 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:39.388926 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc" podStartSLOduration=1.388907869 podStartE2EDuration="1.388907869s" podCreationTimestamp="2026-04-17 16:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:49:39.387614671 +0000 UTC m=+1765.768243056" watchObservedRunningTime="2026-04-17 16:49:39.388907869 +0000 UTC m=+1765.769536232"
Apr 17 16:49:39.557093 ip-10-0-136-214 kubenswrapper[2569]: I0417
16:49:39.557056 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-44ks8_4fd6f5cc-5c82-4053-9185-3fd37df03519/volume-data-source-validator/0.log" Apr 17 16:49:40.370762 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:40.370737 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fh45r_ccfa7faf-8272-48ab-b2fa-20b063c3b4ad/dns/0.log" Apr 17 16:49:40.394589 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:40.394564 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fh45r_ccfa7faf-8272-48ab-b2fa-20b063c3b4ad/kube-rbac-proxy/0.log" Apr 17 16:49:40.459077 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:40.459047 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8vckd_919c2101-3bb9-439c-89fe-f84487ea8e6d/dns-node-resolver/0.log" Apr 17 16:49:41.021968 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:41.021922 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fptfr_3237eb23-86ef-44a2-98cb-f37d4d9fb915/node-ca/0.log" Apr 17 16:49:41.879686 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:41.879654 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf64kpv_f5604b22-37d7-4787-9612-bca485e7867d/istio-proxy/0.log" Apr 17 16:49:42.065799 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:42.065765 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-9rngn_f8651809-7728-4f9e-ac55-a5441cb3d52d/istio-proxy/0.log" Apr 17 16:49:42.612528 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:42.612497 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-h5d4m_dce4e627-2afb-4861-8b1a-4bf531c0f4a7/serve-healthcheck-canary/0.log" Apr 17 16:49:43.135200 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:43.135171 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gz7tf_bdfda63a-84fe-48c6-817f-4ccccdc6ceae/kube-rbac-proxy/0.log" Apr 17 16:49:43.159809 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:43.159780 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gz7tf_bdfda63a-84fe-48c6-817f-4ccccdc6ceae/exporter/0.log" Apr 17 16:49:43.179840 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:43.179807 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gz7tf_bdfda63a-84fe-48c6-817f-4ccccdc6ceae/extractor/0.log" Apr 17 16:49:45.144857 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:45.144825 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-55f9d656c8-xdws9_72026730-e31b-4bb0-9c07-3932db13003d/maas-api/0.log" Apr 17 16:49:45.192547 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:45.192513 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5755dd7cbb-cln2h_a7adc5d2-9237-42d8-b35d-0f667707cd3a/manager/0.log" Apr 17 16:49:45.315921 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:45.315890 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54994d49cf-xncm4_177e44a7-ba5c-44b4-901b-5c0baa4df9fe/manager/0.log" Apr 17 16:49:45.345848 ip-10-0-136-214 
kubenswrapper[2569]: I0417 16:49:45.345817 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-swrsx_c5474b0d-7a8c-48cf-bb20-e3d94df6c617/postgres/0.log" Apr 17 16:49:45.385383 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:45.385357 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-82dsc" Apr 17 16:49:46.443838 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:46.443812 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7bf4f445d7-sz9l2_46da63db-1f77-4978-a1ab-a8bd31c70bb0/manager/0.log" Apr 17 16:49:46.495022 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:46.494981 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-nvt7l_e342dd7d-02bc-41b4-a004-04300ce0d1e5/openshift-lws-operator/0.log" Apr 17 16:49:50.959875 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:50.959845 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dtwws_2e21120d-2f0e-4730-ad47-2a2a7275109d/migrator/0.log" Apr 17 16:49:50.980515 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:50.980490 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-dtwws_2e21120d-2f0e-4730-ad47-2a2a7275109d/graceful-termination/0.log" Apr 17 16:49:52.529731 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.529699 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/kube-multus-additional-cni-plugins/0.log" Apr 17 16:49:52.551371 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.551342 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/egress-router-binary-copy/0.log" Apr 17 16:49:52.571582 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.571555 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/cni-plugins/0.log" Apr 17 16:49:52.590908 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.590882 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/bond-cni-plugin/0.log" Apr 17 16:49:52.611192 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.611168 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/routeoverride-cni/0.log" Apr 17 16:49:52.631932 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.631906 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/whereabouts-cni-bincopy/0.log" Apr 17 16:49:52.657534 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.657503 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vfhjq_3dfa7029-ad7c-4849-aaf2-9516b86babac/whereabouts-cni/0.log" Apr 17 16:49:52.868872 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.868843 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-jswsr_9a537c40-6a2e-4250-8d81-dfa908f4f536/kube-multus/0.log" Apr 17 16:49:52.999274 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:52.999244 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tfgvs_b74a4398-a3fb-40e5-b014-d968d4c10069/network-metrics-daemon/0.log" Apr 17 16:49:53.017566 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:53.017533 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tfgvs_b74a4398-a3fb-40e5-b014-d968d4c10069/kube-rbac-proxy/0.log" Apr 17 16:49:54.068073 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.068046 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/ovn-controller/0.log" Apr 17 16:49:54.097099 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.097071 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/ovn-acl-logging/0.log" Apr 17 16:49:54.113908 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.113887 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/kube-rbac-proxy-node/0.log" Apr 17 16:49:54.135896 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.135873 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 16:49:54.153338 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.153314 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/northd/0.log" Apr 17 16:49:54.174393 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.174369 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/nbdb/0.log" Apr 17 16:49:54.194657 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.194624 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/sbdb/0.log" Apr 17 16:49:54.300181 ip-10-0-136-214 kubenswrapper[2569]: I0417 16:49:54.300121 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpc4t_fb03560e-c45d-4041-b046-c5c9b2fd22a8/ovnkube-controller/0.log"