Apr 17 16:50:02.182825 ip-10-0-138-47 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:50:02.182840 ip-10-0-138-47 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:50:02.182850 ip-10-0-138-47 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:50:02.183049 ip-10-0-138-47 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:50:12.397591 ip-10-0-138-47 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:50:12.397613 ip-10-0-138-47 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot ba066051d7b346098b4fa9bf839a2bd1 --
Apr 17 16:52:40.483092 ip-10-0-138-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:52:40.911688 ip-10-0-138-47 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:40.911688 ip-10-0-138-47 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:52:40.911688 ip-10-0-138-47 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:40.911688 ip-10-0-138-47 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:52:40.911688 ip-10-0-138-47 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:52:40.913226 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.913133 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:52:40.916273 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916257 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:40.916273 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916273 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916277 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916281 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916283 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916286 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916289 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916292 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916295 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916297 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916300 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916304 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916308 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916311 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916314 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916317 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916319 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916322 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916330 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916332 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:40.916335 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916335 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916338 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916340 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916343 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916346 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916349 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916352 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916356 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916360 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916363 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916365 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916368 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916371 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916373 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916376 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916378 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916381 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916383 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916386 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916388 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:40.916793 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916391 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916393 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916395 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916398 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916400 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916403 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916405 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916408 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916410 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916413 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916415 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916418 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916420 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916423 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916426 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916429 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916432 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916434 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916437 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916440 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:40.917319 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916442 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916445 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916447 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916450 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916452 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916455 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916457 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916460 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916462 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916465 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916468 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916472 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916475 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916477 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916480 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916482 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916485 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916487 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916489 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916492 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:40.917815 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916494 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916497 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916499 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916501 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916504 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916506 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916908 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916913 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916916 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916919 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916923 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916925 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916929 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916932 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916934 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916937 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916940 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916942 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916946 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:52:40.918311 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916949 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916951 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916954 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916956 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916959 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916961 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916964 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916966 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916969 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916971 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916974 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916976 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916979 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916981 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916984 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916986 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916991 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916994 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.916997 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:52:40.918809 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917001 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917004 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917007 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917010 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917012 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917015 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917017 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917020 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917022 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917025 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917028 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917030 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917033 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917036 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917038 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917041 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917043 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917045 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917048 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917051 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:52:40.919277 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917053 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917056 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917059 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917061 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917065 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917069 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917072 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917075 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917077 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917080 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917083 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917086 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917088 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917090 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917093 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917095 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917097 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917100 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917102 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917105 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:52:40.919784 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917108 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917110 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917113 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917115 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917117 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917120 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917123 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917126 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917129 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917131 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917134 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917137 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917139 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.917141 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917769 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917782 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917789 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917794 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917799 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917802 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917807 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:52:40.920274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917811 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917814 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917818 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917822 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917826 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917829 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917832 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917835 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917838 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917841 2574 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917843 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917846 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917851 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917864 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917867 2574 flags.go:64] FLAG: --config-dir=""
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917870 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917874 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917878 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917882 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917886 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917889 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917892 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917895 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917898 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917901 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:52:40.920806 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917904 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917909 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917911 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917914 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917917 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917920 2574 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917924 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917928 2574 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917931 2574 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917934 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917937 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917940 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917944 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917946 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917949 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917952 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917955 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917958 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917961 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917963 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917967 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917969 2574 flags.go:64]
FLAG: --fail-swap-on="true" Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917972 2574 flags.go:64] FLAG: --feature-gates="" Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917976 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917980 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:52:40.921409 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917983 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917992 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917995 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.917998 2574 flags.go:64] FLAG: --help="false" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918001 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-138-47.ec2.internal" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918004 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918007 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918010 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918013 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918017 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:52:40.918020 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918022 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918025 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918028 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918031 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918034 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918037 2574 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918040 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918043 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918046 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918049 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918051 2574 flags.go:64] FLAG: --lock-file="" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918054 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918057 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:52:40.922085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918060 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:52:40.922689 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918065 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918068 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918071 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918074 2574 flags.go:64] FLAG: --logging-format="text" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918076 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918080 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918082 2574 flags.go:64] FLAG: --manifest-url="" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918085 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918091 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918101 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918109 2574 flags.go:64] FLAG: --max-pods="110" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918112 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918115 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918118 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918121 2574 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918124 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918127 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918130 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918137 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918140 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918143 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918146 2574 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:52:40.922689 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918149 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918156 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918158 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918162 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918164 2574 flags.go:64] FLAG: --port="10250" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918167 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918170 2574 
flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e1a8536473e3fdbd" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918174 2574 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918176 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918179 2574 flags.go:64] FLAG: --register-node="true" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918182 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918185 2574 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918189 2574 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918191 2574 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918194 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918197 2574 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918201 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918204 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918207 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918211 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918214 2574 flags.go:64] FLAG: --runonce="false" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918217 2574 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918220 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918223 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918226 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918229 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:52:40.923261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918232 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918235 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918238 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918241 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918244 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918247 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918249 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918252 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918255 2574 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918258 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 
16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918263 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918266 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918269 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918274 2574 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918276 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918279 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918282 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918285 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918288 2574 flags.go:64] FLAG: --v="2" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918292 2574 flags.go:64] FLAG: --version="false" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918296 2574 flags.go:64] FLAG: --vmodule="" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918300 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.918303 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918409 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:52:40.923925 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918413 2574 feature_gate.go:328] unrecognized 
feature gate: NoRegistryClusterOperations Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918416 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918419 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918423 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918426 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918429 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918432 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918436 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918439 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918442 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918444 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918447 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918449 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:52:40.924496 ip-10-0-138-47 
kubenswrapper[2574]: W0417 16:52:40.918452 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918454 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918456 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918459 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918462 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918464 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918467 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:52:40.924496 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918469 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918472 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918474 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918476 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918479 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918481 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:52:40.925071 
ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918484 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918486 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918489 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918491 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918494 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918496 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918499 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918501 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918504 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918507 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918513 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918516 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918518 2574 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918522 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:52:40.925071 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918524 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918527 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918529 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918532 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918534 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918536 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918539 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918541 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918544 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918546 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918549 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918551 
2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918553 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918556 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918558 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918561 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918563 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918566 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918568 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:52:40.925572 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918571 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918573 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918577 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918581 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918583 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918586 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918588 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918591 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918594 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918596 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918604 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918607 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918611 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918613 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918616 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918618 2574 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918621 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918623 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918626 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918628 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:52:40.926072 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918631 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:52:40.926561 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918635 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:52:40.926561 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918639 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:52:40.926561 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918642 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:52:40.926561 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918644 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:52:40.926561 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.918647 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:52:40.926561 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.919224 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:52:40.926951 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.926931 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:52:40.926981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.926952 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:52:40.927012 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927002 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:52:40.927012 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927008 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:52:40.927012 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927011 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927014 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927017 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927020 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927024 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927028 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927031 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927034 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927036 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927039 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927041 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927044 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927046 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927049 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927051 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927054 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927073 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927077 2574 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927080 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927084 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:52:40.927087 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927087 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927090 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927093 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927096 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927099 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927102 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927104 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927107 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927110 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927113 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927115 2574 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927118 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927121 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927123 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927126 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927128 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927131 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927133 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927136 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927138 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:52:40.927582 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927140 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927143 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927145 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:52:40.928098 
ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927148 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927150 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927153 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927155 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927158 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927160 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927163 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927165 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927168 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927170 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927172 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927176 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927179 2574 feature_gate.go:328] 
unrecognized feature gate: MachineConfigNodes Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927181 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927184 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927186 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:52:40.928098 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927191 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927195 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927198 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927201 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927204 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927206 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927209 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927211 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927213 2574 feature_gate.go:328] unrecognized 
feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927216 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927218 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927221 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927223 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927226 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927229 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927231 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927233 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927236 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927238 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:52:40.928593 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927241 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927244 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927246 2574 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927248 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927251 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927253 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.927258 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927361 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927366 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927370 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927374 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927377 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927380 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927383 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927385 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:52:40.929073 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927388 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927390 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927393 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927396 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927398 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927401 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927403 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927406 
2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927409 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927411 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927413 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927416 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927420 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927423 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927426 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927429 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927432 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927435 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927438 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:52:40.929440 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927440 2574 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927443 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927446 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927448 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927450 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927453 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927456 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927458 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927461 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927463 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927466 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927468 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927471 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927473 2574 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927476 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927478 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927481 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927483 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927486 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927488 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:52:40.929936 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927491 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927493 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927496 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927498 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927501 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927503 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 
16:52:40.927506 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927508 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927511 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927513 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927516 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927518 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927521 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927524 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927526 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927529 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927531 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927534 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927536 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints 
Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927538 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:52:40.930422 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927541 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927543 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927546 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927548 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927551 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927553 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927556 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927558 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927560 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927564 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927567 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927569 2574 feature_gate.go:328] unrecognized 
feature gate: AutomatedEtcdBackup Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927572 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927574 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927577 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927579 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927582 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927584 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:52:40.930924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:40.927587 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:52:40.931352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.927604 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:52:40.931352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.927732 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 16:52:40.931352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.929944 2574 bootstrap.go:101] "Use the bootstrap 
credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 16:52:40.931352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.930845 2574 server.go:1019] "Starting client certificate rotation" Apr 17 16:52:40.931352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.930945 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:52:40.931352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.930984 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:52:40.952831 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.952808 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:52:40.956796 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.956765 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:52:40.968313 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.968294 2574 log.go:25] "Validated CRI v1 runtime API" Apr 17 16:52:40.974598 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.974576 2574 log.go:25] "Validated CRI v1 image API" Apr 17 16:52:40.979024 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.979005 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:52:40.979095 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.979030 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 16:52:40.983624 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.983597 2574 fs.go:135] Filesystem UUIDs: map[18f84663-6ebd-479f-b268-120c9c0bdc91:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fe7996f5-22ee-4980-a432-15d14015ba00:/dev/nvme0n1p3] Apr 17 
16:52:40.983703 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.983625 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 16:52:40.989834 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.989723 2574 manager.go:217] Machine: {Timestamp:2026-04-17 16:52:40.987852904 +0000 UTC m=+0.387281432 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106980 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2972bb84b019ed518000fc0f948c6a SystemUUID:ec2972bb-84b0-19ed-5180-00fc0f948c6a BootID:ba066051-d7b3-4609-8b4f-a9bf839a2bd1 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:55:9c:80:8d:f5 Speed:0 Mtu:9001} {Name:ens5 
MacAddress:02:55:9c:80:8d:f5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:37:01:6f:2a:d9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:52:40.990441 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.990429 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 16:52:40.990538 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.990526 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:52:40.991602 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.991576 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:52:40.991767 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.991605 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:52:40.991815 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.991777 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:52:40.991815 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.991785 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:52:40.991815 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.991802 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:52:40.993348 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.993336 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:52:40.995239 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.995226 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:52:40.995348 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.995339 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:52:40.997618 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.997609 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:52:40.997666 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.997623 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:52:40.997666 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.997642 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:52:40.997666 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.997664 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:52:40.997782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.997673 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:52:40.998727 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.998711 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:52:40.998727 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:40.998731 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:52:41.001804 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.001787 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:52:41.003713 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.003696 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v7lhm"
Apr 17 16:52:41.003865 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.003851 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:52:41.008489 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008466 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:52:41.008579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008498 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:52:41.008579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008509 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:52:41.008579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008553 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:52:41.008579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008563 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 16:52:41.008579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008573 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 16:52:41.008840 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008582 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 16:52:41.008840 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008591 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 16:52:41.008840 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008604 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 16:52:41.008840 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008614 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 16:52:41.008840 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008641 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 16:52:41.008840 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.008680 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 16:52:41.009524 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.009509 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 16:52:41.009647 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.009530 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 16:52:41.010603 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.010578 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 16:52:41.010722 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.010696 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 16:52:41.013555 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.013542 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 16:52:41.013605 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.013583 2574 server.go:1295] "Started kubelet"
Apr 17 16:52:41.013757 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.013733 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v7lhm"
Apr 17 16:52:41.013837 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.013720 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 16:52:41.013837 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.013809 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 16:52:41.014085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.014059 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 16:52:41.014474 ip-10-0-138-47 systemd[1]: Started Kubernetes Kubelet.
Apr 17 16:52:41.014955 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.014900 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 16:52:41.016197 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.016168 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 16:52:41.016542 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.016528 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 16:52:41.022086 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.022049 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 16:52:41.023437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.023421 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 16:52:41.023945 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.023884 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 16:52:41.024555 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024538 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 16:52:41.024681 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024554 2574 factory.go:55] Registering systemd factory
Apr 17 16:52:41.024779 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024687 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 17 16:52:41.024779 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024574 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 16:52:41.024779 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.024612 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.024779 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024748 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 16:52:41.024982 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024901 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 16:52:41.024982 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.024910 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 16:52:41.025105 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.025092 2574 factory.go:153] Registering CRI-O factory
Apr 17 16:52:41.025153 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.025109 2574 factory.go:223] Registration of the crio container factory successfully
Apr 17 16:52:41.025198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.025158 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 16:52:41.025198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.025183 2574 factory.go:103] Registering Raw factory
Apr 17 16:52:41.025198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.025196 2574 manager.go:1196] Started watching for new ooms in manager
Apr 17 16:52:41.025608 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.025587 2574 manager.go:319] Starting recovery of all containers
Apr 17 16:52:41.026038 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.026012 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:41.028158 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.028132 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-47.ec2.internal\" not found" node="ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.035905 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.035891 2574 manager.go:324] Recovery completed
Apr 17 16:52:41.041640 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.041624 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:41.045723 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.045704 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:41.045800 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.045734 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:41.045800 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.045746 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:41.046243 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.046231 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 16:52:41.046243 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.046242 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 16:52:41.046314 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.046258 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:52:41.049895 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.049880 2574 policy_none.go:49] "None policy: Start"
Apr 17 16:52:41.049965 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.049900 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 16:52:41.049965 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.049915 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 16:52:41.089248 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089231 2574 manager.go:341] "Starting Device Plugin manager"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.089263 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089273 2574 server.go:85] "Starting device plugin registration server"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089518 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089530 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089630 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089721 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.089730 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.090282 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 16:52:41.114356 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.090320 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.160638 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.160600 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 16:52:41.161972 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.161908 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 16:52:41.161972 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.161934 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 16:52:41.161972 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.161953 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 16:52:41.161972 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.161960 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 16:52:41.162205 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.162056 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 16:52:41.174412 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.174384 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:41.190072 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.190041 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:41.190956 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.190942 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:41.191033 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.190970 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:41.191033 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.190981 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:41.191033 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.191013 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.196777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.196764 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.196819 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.196785 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-47.ec2.internal\": node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.220773 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.220748 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.262844 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.262807 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"]
Apr 17 16:52:41.262928 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.262917 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:41.263862 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.263849 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:41.263911 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.263877 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:41.263911 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.263886 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:41.266166 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266153 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:41.266326 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266313 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.266367 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266339 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:41.266873 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266840 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:41.266941 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266886 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:41.266941 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266898 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:41.267005 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266862 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:41.267005 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266958 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:41.267005 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.266968 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:41.269099 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.269086 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.269143 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.269107 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 16:52:41.269948 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.269932 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 16:52:41.270054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.269963 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 16:52:41.270054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.269978 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeHasSufficientPID"
Apr 17 16:52:41.291310 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.291291 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-47.ec2.internal\" not found" node="ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.295537 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.295522 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-47.ec2.internal\" not found" node="ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.321321 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.321300 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.326708 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.326688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f1cf25a30bb24672cb7e749f4e426384-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal\" (UID: \"f1cf25a30bb24672cb7e749f4e426384\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.326811 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.326717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1cf25a30bb24672cb7e749f4e426384-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal\" (UID: \"f1cf25a30bb24672cb7e749f4e426384\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.326811 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.326735 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c4eaf6115bb9effc38fe56b1daa6a65-config\") pod \"kube-apiserver-proxy-ip-10-0-138-47.ec2.internal\" (UID: \"7c4eaf6115bb9effc38fe56b1daa6a65\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.421443 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.421375 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.427834 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.427816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f1cf25a30bb24672cb7e749f4e426384-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal\" (UID: \"f1cf25a30bb24672cb7e749f4e426384\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.427891 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.427846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1cf25a30bb24672cb7e749f4e426384-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal\" (UID: \"f1cf25a30bb24672cb7e749f4e426384\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.427891 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.427867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c4eaf6115bb9effc38fe56b1daa6a65-config\") pod \"kube-apiserver-proxy-ip-10-0-138-47.ec2.internal\" (UID: \"7c4eaf6115bb9effc38fe56b1daa6a65\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.427951 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.427912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7c4eaf6115bb9effc38fe56b1daa6a65-config\") pod \"kube-apiserver-proxy-ip-10-0-138-47.ec2.internal\" (UID: \"7c4eaf6115bb9effc38fe56b1daa6a65\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.427951 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.427913 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1cf25a30bb24672cb7e749f4e426384-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal\" (UID: \"f1cf25a30bb24672cb7e749f4e426384\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.427951 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.427913 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f1cf25a30bb24672cb7e749f4e426384-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal\" (UID: \"f1cf25a30bb24672cb7e749f4e426384\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.521697 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.521628 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.594139 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.594108 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.599108 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.599091 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"
Apr 17 16:52:41.622446 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.622273 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.722945 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.722847 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.823365 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.823328 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.923975 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:41.923939 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-47.ec2.internal\" not found"
Apr 17 16:52:41.931193 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.931172 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:52:41.931345 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.931327 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:52:41.931395 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.931349 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:52:41.949412 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.949388 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:52:41.998671 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:41.998619 2574 apiserver.go:52] "Watching apiserver"
Apr 17 16:52:42.009338 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.009310 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:52:42.009814 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.009789 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fphrb","openshift-image-registry/node-ca-68lhq","openshift-network-operator/iptables-alerter-qm79k","openshift-ovn-kubernetes/ovnkube-node-9zlb2","kube-system/konnectivity-agent-gpbll","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz","openshift-cluster-node-tuning-operator/tuned-qktd6","openshift-multus/multus-additional-cni-plugins-d47rg","openshift-multus/multus-pk2lp","openshift-multus/network-metrics-daemon-7t66h","openshift-network-diagnostics/network-check-target-pjg74"]
Apr 17 16:52:42.014494 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.014464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fphrb"
Apr 17 16:52:42.015973 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.015905 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:47:41 +0000 UTC" deadline="2028-01-31 03:13:13.40161693 +0000 UTC"
Apr 17 16:52:42.015973 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.015947 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15682h20m31.385674961s"
Apr 17 16:52:42.016746 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.016589 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-68lhq"
Apr 17 16:52:42.017159 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.017132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qm79k"
Apr 17 16:52:42.020183 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.017779 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:52:42.020183 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.017977 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-6mljv\""
Apr 17 16:52:42.020183 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.017781 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:52:42.020183 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.019822 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:52:42.020183 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.020135 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:52:42.021068 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.021047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:52:42.021174 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.021083 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-54nsz\""
Apr 17 16:52:42.021294 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.021276 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:52:42.021357 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.021327 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vj7nk\""
Apr 17 16:52:42.021411 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.021394 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:52:42.021504 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.021487 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:52:42.023594 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.023575 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:52:42.023684 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.023672 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2"
Apr 17 16:52:42.023864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.023843 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.024113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.024094 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal" Apr 17 16:52:42.026022 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026000 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.026143 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026102 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z9bch\"" Apr 17 16:52:42.026426 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026401 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:52:42.026525 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026429 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:52:42.026525 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:52:42.026692 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026552 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:52:42.026692 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:52:42.026838 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026747 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:52:42.026900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.026843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:52:42.027090 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.027074 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wq4nq\"" Apr 17 16:52:42.027863 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.027849 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:52:42.028373 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.028356 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.028704 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.028674 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:52:42.028822 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.028681 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66wr2\"" Apr 17 16:52:42.028932 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.028891 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:52:42.029096 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.029080 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:52:42.030771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.030754 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.031283 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031265 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:52:42.031370 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031270 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gd9q2\"" Apr 17 16:52:42.031370 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-node-log\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031370 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031327 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:52:42.031370 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031334 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-log-socket\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031370 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-cni-netd\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae10bb-5884-4fb4-9c4a-473df80ffa49-host\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.031619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031440 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-systemd\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovnkube-config\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031554 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/1de234cc-f7b3-4b43-96ec-12c143fb5b33-konnectivity-ca\") pod \"konnectivity-agent-gpbll\" (UID: \"1de234cc-f7b3-4b43-96ec-12c143fb5b33\") " pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.031619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-registration-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-etc-selinux\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl77m\" (UniqueName: \"kubernetes.io/projected/f44d3d65-cb34-4e08-aad3-2bfd9398b990-kube-api-access-jl77m\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj695\" (UniqueName: \"kubernetes.io/projected/fa2701e6-325c-4b24-9bcb-827a9099143e-kube-api-access-bj695\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 
16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031732 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tzgr\" (UniqueName: \"kubernetes.io/projected/22ae10bb-5884-4fb4-9c4a-473df80ffa49-kube-api-access-5tzgr\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovnkube-script-lib\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031779 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1de234cc-f7b3-4b43-96ec-12c143fb5b33-agent-certs\") pod \"konnectivity-agent-gpbll\" (UID: \"1de234cc-f7b3-4b43-96ec-12c143fb5b33\") " pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031833 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-sys-fs\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22ae10bb-5884-4fb4-9c4a-473df80ffa49-serviceca\") pod \"node-ca-68lhq\" 
(UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-systemd-units\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.031909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-run-netns\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nxfv\" (UniqueName: \"kubernetes.io/projected/a9562614-f113-46e2-95eb-ab53f4ee4f5d-kube-api-access-4nxfv\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-socket-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-etc-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.031986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa2701e6-325c-4b24-9bcb-827a9099143e-hosts-file\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032029 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54fb98b4-00f5-45b6-ac65-4246f1f7273d-iptables-alerter-script\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg8hl\" (UniqueName: \"kubernetes.io/projected/54fb98b4-00f5-45b6-ac65-4246f1f7273d-kube-api-access-vg8hl\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 
16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-slash\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-var-lib-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-device-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-cni-bin\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032158 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-env-overrides\") pod 
\"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovn-node-metrics-cert\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.032214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2701e6-325c-4b24-9bcb-827a9099143e-tmp-dir\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.032746 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54fb98b4-00f5-45b6-ac65-4246f1f7273d-host-slash\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.032746 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-kubelet\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032746 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.032746 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-ovn\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.033039 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.032998 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:52:42.033194 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.033175 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.033286 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.033257 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:52:42.033286 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.033264 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:52:42.033286 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.033275 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:52:42.033601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.033588 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:52:42.033644 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.033618 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7dgfk\"" Apr 17 16:52:42.035627 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.035609 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.035776 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.035709 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:42.036219 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.036201 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:52:42.036279 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.036233 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8kn9t\"" Apr 17 16:52:42.037847 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.037831 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:42.037905 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.037892 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:42.038789 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.038773 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal"] Apr 17 16:52:42.039921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.039899 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:52:42.040013 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.040000 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal" Apr 17 16:52:42.041422 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.041401 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:52:42.048828 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.048805 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal"] Apr 17 16:52:42.048944 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.048833 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:52:42.066979 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.066954 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8pj5k" Apr 17 16:52:42.074884 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.074861 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8pj5k" Apr 17 16:52:42.122252 ip-10-0-138-47 kubenswrapper[2574]: W0417 
16:52:42.122224 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c4eaf6115bb9effc38fe56b1daa6a65.slice/crio-078f6300aac2303855bb0d7679a00347584dff95f915832edfcc3552b302e257 WatchSource:0}: Error finding container 078f6300aac2303855bb0d7679a00347584dff95f915832edfcc3552b302e257: Status 404 returned error can't find the container with id 078f6300aac2303855bb0d7679a00347584dff95f915832edfcc3552b302e257 Apr 17 16:52:42.122450 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.122428 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1cf25a30bb24672cb7e749f4e426384.slice/crio-be976013df9d90db0714cf3b3dcf3105c9460becc8512134a0c0a47d38f8d5ce WatchSource:0}: Error finding container be976013df9d90db0714cf3b3dcf3105c9460becc8512134a0c0a47d38f8d5ce: Status 404 returned error can't find the container with id be976013df9d90db0714cf3b3dcf3105c9460becc8512134a0c0a47d38f8d5ce Apr 17 16:52:42.125381 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.125360 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:52:42.126993 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.126978 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:52:42.132457 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae10bb-5884-4fb4-9c4a-473df80ffa49-host\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.132562 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovnkube-config\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.132562 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.132562 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-sys-fs\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.132740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae10bb-5884-4fb4-9c4a-473df80ffa49-host\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.132740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-run\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.132740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-sys-fs\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.132740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-lib-modules\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.132740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-system-cni-dir\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.132740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-socket-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cnibin\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.132998 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:52:42.132761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-cnibin\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-k8s-cni-cncf-io\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-cni-multus\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-modprobe-d\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-os-release\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " 
pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132972 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-system-cni-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-socket-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.132998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.132996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/54fb98b4-00f5-45b6-ac65-4246f1f7273d-iptables-alerter-script\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg8hl\" (UniqueName: \"kubernetes.io/projected/54fb98b4-00f5-45b6-ac65-4246f1f7273d-kube-api-access-vg8hl\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-slash\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-device-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-slash\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133166 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/edf900ab-93e0-45c2-8735-8125d0377fd3-tmp\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-etc-kubernetes\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.133537 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-device-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-env-overrides\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovn-node-metrics-cert\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133616 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54fb98b4-00f5-45b6-ac65-4246f1f7273d-iptables-alerter-script\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-daemon-config\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-var-lib-kubelet\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133766 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovnkube-config\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-kubelet\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-kubelet\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-ovn\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-node-log\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.133900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-cni-netd\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-ovn\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133926 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-etc-selinux\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-node-log\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl77m\" (UniqueName: \"kubernetes.io/projected/f44d3d65-cb34-4e08-aad3-2bfd9398b990-kube-api-access-jl77m\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133956 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-cni-netd\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133997 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-etc-selinux\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.134049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-env-overrides\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.134503 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.134077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-systemd\") pod \"ovnkube-node-9zlb2\" (UID: 
\"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.134818 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.134765 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:52:42.135089 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.133980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-systemd\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135160 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-run-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135160 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135247 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-registration-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.135247 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysconfig\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.135336 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135336 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysctl-d\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.135336 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-systemd\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.135495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-sys\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.135495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f44d3d65-cb34-4e08-aad3-2bfd9398b990-registration-dir\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.135495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.135495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj695\" (UniqueName: \"kubernetes.io/projected/fa2701e6-325c-4b24-9bcb-827a9099143e-kube-api-access-bj695\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.135495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tzgr\" (UniqueName: \"kubernetes.io/projected/22ae10bb-5884-4fb4-9c4a-473df80ffa49-kube-api-access-5tzgr\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.135495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135462 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovnkube-script-lib\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1de234cc-f7b3-4b43-96ec-12c143fb5b33-agent-certs\") pod \"konnectivity-agent-gpbll\" (UID: \"1de234cc-f7b3-4b43-96ec-12c143fb5b33\") " pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjt9\" (UniqueName: \"kubernetes.io/projected/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-kube-api-access-qqjt9\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135554 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-socket-dir-parent\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135586 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22ae10bb-5884-4fb4-9c4a-473df80ffa49-serviceca\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 
16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-systemd-units\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-run-netns\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-systemd-units\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nxfv\" (UniqueName: \"kubernetes.io/projected/a9562614-f113-46e2-95eb-ab53f4ee4f5d-kube-api-access-4nxfv\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-cni-bin\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " 
pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.135799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-hostroot\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-multus-certs\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135844 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djs5m\" (UniqueName: \"kubernetes.io/projected/af693ee1-cbf2-4af0-9b87-70d6ad66e314-kube-api-access-djs5m\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.135921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-run-netns\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-9zlb2\" (UID: 
\"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysctl-conf\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-host\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/22ae10bb-5884-4fb4-9c4a-473df80ffa49-serviceca\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovnkube-script-lib\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-conf-dir\") pod 
\"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-etc-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.136383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-etc-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137007 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8nhp\" (UniqueName: \"kubernetes.io/projected/edf900ab-93e0-45c2-8735-8125d0377fd3-kube-api-access-t8nhp\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.137007 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa2701e6-325c-4b24-9bcb-827a9099143e-hosts-file\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.137007 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-run-ovn-kubernetes\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137007 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.136868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa2701e6-325c-4b24-9bcb-827a9099143e-hosts-file\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.137198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-var-lib-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-kubernetes\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.137198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-tuned\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.137198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af693ee1-cbf2-4af0-9b87-70d6ad66e314-cni-binary-copy\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.137198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-var-lib-openvswitch\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-netns\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-cni-bin\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137261 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-cni-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-host-cni-bin\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-kubelet\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137326 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2701e6-325c-4b24-9bcb-827a9099143e-tmp-dir\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54fb98b4-00f5-45b6-ac65-4246f1f7273d-host-slash\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.137450 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137427 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-log-socket\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54fb98b4-00f5-45b6-ac65-4246f1f7273d-host-slash\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1de234cc-f7b3-4b43-96ec-12c143fb5b33-konnectivity-ca\") pod \"konnectivity-agent-gpbll\" (UID: \"1de234cc-f7b3-4b43-96ec-12c143fb5b33\") " pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz22t\" (UniqueName: \"kubernetes.io/projected/c9f04956-3cc9-4095-a965-b3737339bb37-kube-api-access-hz22t\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-os-release\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:52:42.137598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a9562614-f113-46e2-95eb-ab53f4ee4f5d-log-socket\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.137825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.137740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa2701e6-325c-4b24-9bcb-827a9099143e-tmp-dir\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.140680 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.138197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1de234cc-f7b3-4b43-96ec-12c143fb5b33-konnectivity-ca\") pod \"konnectivity-agent-gpbll\" (UID: \"1de234cc-f7b3-4b43-96ec-12c143fb5b33\") " pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.140680 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.139563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a9562614-f113-46e2-95eb-ab53f4ee4f5d-ovn-node-metrics-cert\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.140680 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:52:42.139625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1de234cc-f7b3-4b43-96ec-12c143fb5b33-agent-certs\") pod \"konnectivity-agent-gpbll\" (UID: \"1de234cc-f7b3-4b43-96ec-12c143fb5b33\") " pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.144947 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.144916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl77m\" (UniqueName: \"kubernetes.io/projected/f44d3d65-cb34-4e08-aad3-2bfd9398b990-kube-api-access-jl77m\") pod \"aws-ebs-csi-driver-node-2pcvz\" (UID: \"f44d3d65-cb34-4e08-aad3-2bfd9398b990\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.145137 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.145091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj695\" (UniqueName: \"kubernetes.io/projected/fa2701e6-325c-4b24-9bcb-827a9099143e-kube-api-access-bj695\") pod \"node-resolver-fphrb\" (UID: \"fa2701e6-325c-4b24-9bcb-827a9099143e\") " pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.145838 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.145815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg8hl\" (UniqueName: \"kubernetes.io/projected/54fb98b4-00f5-45b6-ac65-4246f1f7273d-kube-api-access-vg8hl\") pod \"iptables-alerter-qm79k\" (UID: \"54fb98b4-00f5-45b6-ac65-4246f1f7273d\") " pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.146202 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.146182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tzgr\" (UniqueName: \"kubernetes.io/projected/22ae10bb-5884-4fb4-9c4a-473df80ffa49-kube-api-access-5tzgr\") pod \"node-ca-68lhq\" (UID: \"22ae10bb-5884-4fb4-9c4a-473df80ffa49\") " pod="openshift-image-registry/node-ca-68lhq" Apr 
17 16:52:42.146640 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.146620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nxfv\" (UniqueName: \"kubernetes.io/projected/a9562614-f113-46e2-95eb-ab53f4ee4f5d-kube-api-access-4nxfv\") pod \"ovnkube-node-9zlb2\" (UID: \"a9562614-f113-46e2-95eb-ab53f4ee4f5d\") " pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.164546 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.164524 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:52:42.165383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.165345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal" event={"ID":"7c4eaf6115bb9effc38fe56b1daa6a65","Type":"ContainerStarted","Data":"078f6300aac2303855bb0d7679a00347584dff95f915832edfcc3552b302e257"} Apr 17 16:52:42.166268 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.166249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal" event={"ID":"f1cf25a30bb24672cb7e749f4e426384","Type":"ContainerStarted","Data":"be976013df9d90db0714cf3b3dcf3105c9460becc8512134a0c0a47d38f8d5ce"} Apr 17 16:52:42.238238 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-netns\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238238 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-cni-dir\") pod 
\"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238238 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-kubelet\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-netns\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz22t\" (UniqueName: \"kubernetes.io/projected/c9f04956-3cc9-4095-a965-b3737339bb37-kube-api-access-hz22t\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-kubelet\") pod 
\"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-os-release\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-cni-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.238366 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-os-release\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238370 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.238445 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:42.738410626 +0000 UTC m=+2.137839146 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-run\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.238485 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-lib-modules\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-run\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-system-cni-dir\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cnibin\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-lib-modules\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-cnibin\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cnibin\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-k8s-cni-cncf-io\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-cni-multus\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-modprobe-d\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-os-release\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-k8s-cni-cncf-io\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-cnibin\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-system-cni-dir\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-system-cni-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-cni-multus\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239067 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-modprobe-d\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-os-release\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/edf900ab-93e0-45c2-8735-8125d0377fd3-tmp\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238855 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-system-cni-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238872 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-etc-kubernetes\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-daemon-config\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-var-lib-kubelet\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cni-binary-copy\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-etc-kubernetes\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysconfig\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.238998 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysctl-d\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-var-lib-kubelet\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-systemd\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-sys\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysconfig\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.239784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-systemd\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-sys\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjt9\" (UniqueName: \"kubernetes.io/projected/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-kube-api-access-qqjt9\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysctl-d\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-socket-dir-parent\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-cni-bin\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239181 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-hostroot\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-multus-certs\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-socket-dir-parent\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djs5m\" (UniqueName: \"kubernetes.io/projected/af693ee1-cbf2-4af0-9b87-70d6ad66e314-kube-api-access-djs5m\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysctl-conf\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-host\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-conf-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-hostroot\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-var-lib-cni-bin\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8nhp\" (UniqueName: 
\"kubernetes.io/projected/edf900ab-93e0-45c2-8735-8125d0377fd3-kube-api-access-t8nhp\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-host-run-multus-certs\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239377 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-kubernetes\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-host\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-conf-dir\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239406 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-tuned\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-sysctl-conf\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af693ee1-cbf2-4af0-9b87-70d6ad66e314-cni-binary-copy\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239472 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/af693ee1-cbf2-4af0-9b87-70d6ad66e314-multus-daemon-config\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-kubernetes\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.240782 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.239870 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/af693ee1-cbf2-4af0-9b87-70d6ad66e314-cni-binary-copy\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.241352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.241332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/edf900ab-93e0-45c2-8735-8125d0377fd3-tmp\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.241421 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.241384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/edf900ab-93e0-45c2-8735-8125d0377fd3-etc-tuned\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.245903 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.245888 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:42.245954 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.245905 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:42.245954 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.245916 2574 projected.go:194] Error preparing data for 
projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:42.246029 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.245963 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. No retries permitted until 2026-04-17 16:52:42.745950916 +0000 UTC m=+2.145379426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:42.247931 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.247911 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjt9\" (UniqueName: \"kubernetes.io/projected/a3d26988-7d2d-401c-a249-4bebb0e0a0d6-kube-api-access-qqjt9\") pod \"multus-additional-cni-plugins-d47rg\" (UID: \"a3d26988-7d2d-401c-a249-4bebb0e0a0d6\") " pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.248522 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.248501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz22t\" (UniqueName: \"kubernetes.io/projected/c9f04956-3cc9-4095-a965-b3737339bb37-kube-api-access-hz22t\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.248700 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:52:42.248682 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8nhp\" (UniqueName: \"kubernetes.io/projected/edf900ab-93e0-45c2-8735-8125d0377fd3-kube-api-access-t8nhp\") pod \"tuned-qktd6\" (UID: \"edf900ab-93e0-45c2-8735-8125d0377fd3\") " pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.249423 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.249408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djs5m\" (UniqueName: \"kubernetes.io/projected/af693ee1-cbf2-4af0-9b87-70d6ad66e314-kube-api-access-djs5m\") pod \"multus-pk2lp\" (UID: \"af693ee1-cbf2-4af0-9b87-70d6ad66e314\") " pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.336097 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.336074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fphrb" Apr 17 16:52:42.341956 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.341930 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2701e6_325c_4b24_9bcb_827a9099143e.slice/crio-d23aff554f1d397f37b9499c487be4ef693a2e0e08330a2c04b7433d18231073 WatchSource:0}: Error finding container d23aff554f1d397f37b9499c487be4ef693a2e0e08330a2c04b7433d18231073: Status 404 returned error can't find the container with id d23aff554f1d397f37b9499c487be4ef693a2e0e08330a2c04b7433d18231073 Apr 17 16:52:42.346007 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.345992 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" Apr 17 16:52:42.352154 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.352134 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf44d3d65_cb34_4e08_aad3_2bfd9398b990.slice/crio-256114182b33f11c5e98e34ba59dd88bd26403af87a598dd974b17321ca6c7c1 WatchSource:0}: Error finding container 256114182b33f11c5e98e34ba59dd88bd26403af87a598dd974b17321ca6c7c1: Status 404 returned error can't find the container with id 256114182b33f11c5e98e34ba59dd88bd26403af87a598dd974b17321ca6c7c1 Apr 17 16:52:42.363280 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.363264 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qm79k" Apr 17 16:52:42.368967 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.368948 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54fb98b4_00f5_45b6_ac65_4246f1f7273d.slice/crio-3b603daf498f50b396ca82906e2fed50ea83df3797ced7475ebf159b04b1891e WatchSource:0}: Error finding container 3b603daf498f50b396ca82906e2fed50ea83df3797ced7475ebf159b04b1891e: Status 404 returned error can't find the container with id 3b603daf498f50b396ca82906e2fed50ea83df3797ced7475ebf159b04b1891e Apr 17 16:52:42.373772 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.373753 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-68lhq" Apr 17 16:52:42.380010 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.379996 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ae10bb_5884_4fb4_9c4a_473df80ffa49.slice/crio-5e06d20066a0dacd1e5bc74afc283a29e4d9fa1add84275ff48353a813c6a23d WatchSource:0}: Error finding container 5e06d20066a0dacd1e5bc74afc283a29e4d9fa1add84275ff48353a813c6a23d: Status 404 returned error can't find the container with id 5e06d20066a0dacd1e5bc74afc283a29e4d9fa1add84275ff48353a813c6a23d Apr 17 16:52:42.389967 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.389946 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:52:42.395212 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.395194 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9562614_f113_46e2_95eb_ab53f4ee4f5d.slice/crio-e937ad2d52d2aa582a43ba536e36bd6dae62190b5555a76fa3c14107318ef066 WatchSource:0}: Error finding container e937ad2d52d2aa582a43ba536e36bd6dae62190b5555a76fa3c14107318ef066: Status 404 returned error can't find the container with id e937ad2d52d2aa582a43ba536e36bd6dae62190b5555a76fa3c14107318ef066 Apr 17 16:52:42.406828 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.406810 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:52:42.413096 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.413077 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de234cc_f7b3_4b43_96ec_12c143fb5b33.slice/crio-0a7b251feaa7f4e1d353063194c713bc7b240021c226e4ef4cf7a67808772d15 WatchSource:0}: Error finding container 0a7b251feaa7f4e1d353063194c713bc7b240021c226e4ef4cf7a67808772d15: Status 404 returned error can't find the container with id 0a7b251feaa7f4e1d353063194c713bc7b240021c226e4ef4cf7a67808772d15 Apr 17 16:52:42.429811 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.429770 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qktd6" Apr 17 16:52:42.434918 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.434895 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf900ab_93e0_45c2_8735_8125d0377fd3.slice/crio-8086a7969ceae91f609b854675f4b6f17be8597342403a2f868d67aac4f0dc69 WatchSource:0}: Error finding container 8086a7969ceae91f609b854675f4b6f17be8597342403a2f868d67aac4f0dc69: Status 404 returned error can't find the container with id 8086a7969ceae91f609b854675f4b6f17be8597342403a2f868d67aac4f0dc69 Apr 17 16:52:42.441599 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.441583 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d47rg" Apr 17 16:52:42.448106 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.448086 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pk2lp" Apr 17 16:52:42.448455 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.448434 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3d26988_7d2d_401c_a249_4bebb0e0a0d6.slice/crio-a924c4995a8d118f1d99bcdec3ec8d6e06da81d323164e84c398862a0839d4d0 WatchSource:0}: Error finding container a924c4995a8d118f1d99bcdec3ec8d6e06da81d323164e84c398862a0839d4d0: Status 404 returned error can't find the container with id a924c4995a8d118f1d99bcdec3ec8d6e06da81d323164e84c398862a0839d4d0 Apr 17 16:52:42.453488 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:52:42.453465 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf693ee1_cbf2_4af0_9b87_70d6ad66e314.slice/crio-f43a142422ebac154dfe71275aa8b4f87e2209d2e7f03a0607285c0adb8cca60 WatchSource:0}: Error finding container f43a142422ebac154dfe71275aa8b4f87e2209d2e7f03a0607285c0adb8cca60: Status 404 returned error can't find the container with id f43a142422ebac154dfe71275aa8b4f87e2209d2e7f03a0607285c0adb8cca60 Apr 17 16:52:42.743432 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.742875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:42.743432 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.743040 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:42.743432 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.743098 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:43.743078954 +0000 UTC m=+3.142507487 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:42.843608 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.843573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:42.843803 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.843741 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:42.843803 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.843760 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:42.843803 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.843774 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:42.843965 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:42.843835 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. No retries permitted until 2026-04-17 16:52:43.843815527 +0000 UTC m=+3.243244058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:42.911437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:42.911216 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:52:43.043227 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.043146 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:52:43.075955 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.075895 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:47:42 +0000 UTC" deadline="2027-11-29 22:35:26.362743136 +0000 UTC" Apr 17 16:52:43.075955 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.075929 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14189h42m43.286817769s" Apr 17 16:52:43.163043 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.163011 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:43.163220 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.163144 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:43.193448 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.193400 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerStarted","Data":"a924c4995a8d118f1d99bcdec3ec8d6e06da81d323164e84c398862a0839d4d0"} Apr 17 16:52:43.207620 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.207584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"e937ad2d52d2aa582a43ba536e36bd6dae62190b5555a76fa3c14107318ef066"} Apr 17 16:52:43.226394 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.226342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-68lhq" event={"ID":"22ae10bb-5884-4fb4-9c4a-473df80ffa49","Type":"ContainerStarted","Data":"5e06d20066a0dacd1e5bc74afc283a29e4d9fa1add84275ff48353a813c6a23d"} Apr 17 16:52:43.240897 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.240861 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qm79k" event={"ID":"54fb98b4-00f5-45b6-ac65-4246f1f7273d","Type":"ContainerStarted","Data":"3b603daf498f50b396ca82906e2fed50ea83df3797ced7475ebf159b04b1891e"} Apr 17 16:52:43.248224 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.248157 
2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fphrb" event={"ID":"fa2701e6-325c-4b24-9bcb-827a9099143e","Type":"ContainerStarted","Data":"d23aff554f1d397f37b9499c487be4ef693a2e0e08330a2c04b7433d18231073"} Apr 17 16:52:43.259346 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.259316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pk2lp" event={"ID":"af693ee1-cbf2-4af0-9b87-70d6ad66e314","Type":"ContainerStarted","Data":"f43a142422ebac154dfe71275aa8b4f87e2209d2e7f03a0607285c0adb8cca60"} Apr 17 16:52:43.269600 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.269548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qktd6" event={"ID":"edf900ab-93e0-45c2-8735-8125d0377fd3","Type":"ContainerStarted","Data":"8086a7969ceae91f609b854675f4b6f17be8597342403a2f868d67aac4f0dc69"} Apr 17 16:52:43.282830 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.282786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gpbll" event={"ID":"1de234cc-f7b3-4b43-96ec-12c143fb5b33","Type":"ContainerStarted","Data":"0a7b251feaa7f4e1d353063194c713bc7b240021c226e4ef4cf7a67808772d15"} Apr 17 16:52:43.306596 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.306517 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" event={"ID":"f44d3d65-cb34-4e08-aad3-2bfd9398b990","Type":"ContainerStarted","Data":"256114182b33f11c5e98e34ba59dd88bd26403af87a598dd974b17321ca6c7c1"} Apr 17 16:52:43.750698 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.750598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " 
pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:43.750847 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.750790 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:43.750908 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.750856 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:45.750836455 +0000 UTC m=+5.150264970 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:43.851858 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:43.851824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:43.852040 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.852016 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:43.852040 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.852037 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:43.852144 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.852050 2574 
projected.go:194] Error preparing data for projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:43.852144 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:43.852107 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. No retries permitted until 2026-04-17 16:52:45.852090797 +0000 UTC m=+5.251519313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:44.076292 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:44.076202 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:47:42 +0000 UTC" deadline="2027-09-15 04:17:41.969053781 +0000 UTC" Apr 17 16:52:44.076292 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:44.076244 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12371h24m57.892814329s" Apr 17 16:52:44.162191 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:44.162159 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:44.162370 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:44.162290 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:45.163131 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:45.163053 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:45.163546 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.163184 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:45.767618 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:45.767584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:45.767866 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.767844 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:45.767954 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.767920 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:49.767900482 +0000 UTC m=+9.167328997 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:45.869443 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:45.868821 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:45.869443 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.869004 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:45.869443 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.869024 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:45.869443 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.869037 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:45.869443 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:45.869096 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:52:49.869078122 +0000 UTC m=+9.268506637 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:46.162628 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:46.162515 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:46.162803 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:46.162670 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:47.162997 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:47.162961 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:47.163476 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:47.163084 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:48.163120 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:48.163089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:48.163465 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:48.163211 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:49.164272 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:49.163768 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:49.164272 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.163926 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:49.799320 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:49.799266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:49.799483 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.799453 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:49.799547 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.799518 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:52:57.799497701 +0000 UTC m=+17.198926217 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:49.899886 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:49.899824 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:49.900072 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.899968 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:49.900072 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.899989 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:49.900072 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.900003 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:49.900072 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:49.900053 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:52:57.900035214 +0000 UTC m=+17.299463723 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:50.163083 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:50.163007 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:50.163233 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:50.163149 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:51.163327 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:51.163290 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:51.163802 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:51.163415 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:52.162516 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:52.162480 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:52.162702 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:52.162579 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:53.162858 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:53.162813 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:53.163292 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:53.162939 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:54.162128 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:54.162101 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:54.162281 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:54.162222 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:55.162426 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:55.162390 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:55.162950 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:55.162523 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:56.163145 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:56.163103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:56.163564 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:56.163231 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:57.162521 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:57.162489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:57.162710 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.162614 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:52:57.858019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:57.857977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:57.858612 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.858150 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:57.858612 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.858229 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:13.858209275 +0000 UTC m=+33.257637799 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:52:57.958625 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:57.958578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:57.958824 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.958749 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:52:57.958824 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.958769 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:52:57.958824 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.958782 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:57.958977 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:57.958846 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:53:13.958827092 +0000 UTC m=+33.358255603 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:52:58.162397 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:58.162311 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:52:58.162544 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:58.162425 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:52:59.163102 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:52:59.163068 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:52:59.163629 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:52:59.163202 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:00.162790 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:00.162753 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:00.162971 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:00.162871 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:01.163557 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.163379 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:01.164070 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:01.163625 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:01.345950 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.345868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal" event={"ID":"7c4eaf6115bb9effc38fe56b1daa6a65","Type":"ContainerStarted","Data":"4c170ea3a3560be02e94eca7607fe37b0c35d69bfa42acdb9a6dd98dfe05908d"} Apr 17 16:53:01.349613 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.348941 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pk2lp" event={"ID":"af693ee1-cbf2-4af0-9b87-70d6ad66e314","Type":"ContainerStarted","Data":"023b03a66704dc5f9d4f4dd0109e5e9b34dc235351dc67b8e40e33cb3eedeb43"} Apr 17 16:53:01.351579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.351550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qktd6" event={"ID":"edf900ab-93e0-45c2-8735-8125d0377fd3","Type":"ContainerStarted","Data":"e5199c0d4eb8cd5ed9f4307e6819d7d61a5f48134bdb94dd2ad3f5770177c34e"} Apr 17 16:53:01.361960 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.361899 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-47.ec2.internal" podStartSLOduration=19.361886501 podStartE2EDuration="19.361886501s" podCreationTimestamp="2026-04-17 16:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:01.361412735 +0000 UTC m=+20.760841268" watchObservedRunningTime="2026-04-17 16:53:01.361886501 +0000 UTC m=+20.761315033" Apr 17 16:53:01.380291 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.380253 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qktd6" podStartSLOduration=2.048280987 
podStartE2EDuration="20.380240076s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.436180327 +0000 UTC m=+1.835608837" lastFinishedPulling="2026-04-17 16:53:00.768139412 +0000 UTC m=+20.167567926" observedRunningTime="2026-04-17 16:53:01.380169249 +0000 UTC m=+20.779597784" watchObservedRunningTime="2026-04-17 16:53:01.380240076 +0000 UTC m=+20.779668608" Apr 17 16:53:01.399798 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:01.399758 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pk2lp" podStartSLOduration=1.8950521249999999 podStartE2EDuration="20.399743074s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.454848007 +0000 UTC m=+1.854276520" lastFinishedPulling="2026-04-17 16:53:00.959538956 +0000 UTC m=+20.358967469" observedRunningTime="2026-04-17 16:53:01.399439852 +0000 UTC m=+20.798868381" watchObservedRunningTime="2026-04-17 16:53:01.399743074 +0000 UTC m=+20.799171606" Apr 17 16:53:02.163074 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.162766 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:02.163244 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:02.163146 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:02.354973 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.354935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gpbll" event={"ID":"1de234cc-f7b3-4b43-96ec-12c143fb5b33","Type":"ContainerStarted","Data":"52a810a1346f98ac6e7166e177a2ef8a8c41e0394041ccd22f4b18ba08d3c0ce"} Apr 17 16:53:02.356266 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.356242 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" event={"ID":"f44d3d65-cb34-4e08-aad3-2bfd9398b990","Type":"ContainerStarted","Data":"a96d6196978e26eda79838b0db3329552d576ffac8a0ca5aefa46094f6c48561"} Apr 17 16:53:02.357473 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.357455 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3d26988-7d2d-401c-a249-4bebb0e0a0d6" containerID="9fc4f6fcdf50a512a95151af9d8281a05d1989508c729bc51bb3c139e7e07206" exitCode=0 Apr 17 16:53:02.357557 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.357523 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerDied","Data":"9fc4f6fcdf50a512a95151af9d8281a05d1989508c729bc51bb3c139e7e07206"} Apr 17 16:53:02.360547 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.360529 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"994a6a4a0efa2b642e3d2d0077657378a54bf4db69ef1ebbce88f51601d69fd6"} Apr 17 16:53:02.360609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.360552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" 
event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"2493a7d8e87905c5ed86dfd9481c8c7732f725e3730bb51a7cc378e7d6e9bf09"} Apr 17 16:53:02.360609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.360564 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"85b291e988b784336df21878d58c91d591ddc73ef9f394f20e343713bca3dcd1"} Apr 17 16:53:02.360609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.360577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"0bb90691a303b34bb997212a21854a9e4313b0e7753feee78eef64292b0e57fd"} Apr 17 16:53:02.360609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.360588 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"8b9d0384b78088d2388ad75482583fefc1f98df2821b201736ff9ea6844f7a06"} Apr 17 16:53:02.360609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.360606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"f41d7f2308cffbb089915cbfed729b509ec24dd3f42efc302e87d1ed007baad8"} Apr 17 16:53:02.361855 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.361829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-68lhq" event={"ID":"22ae10bb-5884-4fb4-9c4a-473df80ffa49","Type":"ContainerStarted","Data":"d5a89575ae94ad69a4ea8d47f31d3de0177ec9c05ee798e6edc0f23385c54d7e"} Apr 17 16:53:02.364958 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.363640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-qm79k" event={"ID":"54fb98b4-00f5-45b6-ac65-4246f1f7273d","Type":"ContainerStarted","Data":"166e5b44b06d914aef3154f10b24729cfdde15b6bb16bc20c634c0b4935f95ab"} Apr 17 16:53:02.366263 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.366246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fphrb" event={"ID":"fa2701e6-325c-4b24-9bcb-827a9099143e","Type":"ContainerStarted","Data":"77e558bfc359af5b185d9e206f444f34bb9351ff3b7e89b095ac22a04b24725f"} Apr 17 16:53:02.367788 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.367770 2574 generic.go:358] "Generic (PLEG): container finished" podID="f1cf25a30bb24672cb7e749f4e426384" containerID="3e984ffd53f2328c7bcbaf1273b35993ad76f33b6dbed0ec31c2448295ac0649" exitCode=0 Apr 17 16:53:02.367917 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.367864 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal" event={"ID":"f1cf25a30bb24672cb7e749f4e426384","Type":"ContainerDied","Data":"3e984ffd53f2328c7bcbaf1273b35993ad76f33b6dbed0ec31c2448295ac0649"} Apr 17 16:53:02.372698 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.372647 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gpbll" podStartSLOduration=3.039016194 podStartE2EDuration="21.372636182s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.414310477 +0000 UTC m=+1.813738989" lastFinishedPulling="2026-04-17 16:53:00.747930456 +0000 UTC m=+20.147358977" observedRunningTime="2026-04-17 16:53:02.372387579 +0000 UTC m=+21.771816112" watchObservedRunningTime="2026-04-17 16:53:02.372636182 +0000 UTC m=+21.772064713" Apr 17 16:53:02.405982 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.405937 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-qm79k" podStartSLOduration=3.028367984 podStartE2EDuration="21.405924419s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.370391566 +0000 UTC m=+1.769820075" lastFinishedPulling="2026-04-17 16:53:00.747947996 +0000 UTC m=+20.147376510" observedRunningTime="2026-04-17 16:53:02.405772888 +0000 UTC m=+21.805201426" watchObservedRunningTime="2026-04-17 16:53:02.405924419 +0000 UTC m=+21.805352951" Apr 17 16:53:02.444280 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.443564 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fphrb" podStartSLOduration=3.013566967 podStartE2EDuration="21.443548566s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.34356689 +0000 UTC m=+1.742995401" lastFinishedPulling="2026-04-17 16:53:00.773548487 +0000 UTC m=+20.172977000" observedRunningTime="2026-04-17 16:53:02.443531446 +0000 UTC m=+21.842959979" watchObservedRunningTime="2026-04-17 16:53:02.443548566 +0000 UTC m=+21.842977097" Apr 17 16:53:02.459795 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.459735 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-68lhq" podStartSLOduration=3.09309407 podStartE2EDuration="21.45964049s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.381498347 +0000 UTC m=+1.780926857" lastFinishedPulling="2026-04-17 16:53:00.748044758 +0000 UTC m=+20.147473277" observedRunningTime="2026-04-17 16:53:02.459302166 +0000 UTC m=+21.858730698" watchObservedRunningTime="2026-04-17 16:53:02.45964049 +0000 UTC m=+21.859069021" Apr 17 16:53:02.958845 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:02.958821 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" 
Apr 17 16:53:03.099976 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.099748 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:53:02.958844631Z","UUID":"88a5d523-113d-4c75-8692-304c51b707a2","Handler":null,"Name":"","Endpoint":""} Apr 17 16:53:03.101541 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.101518 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:53:03.101694 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.101551 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:53:03.163263 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.163239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:03.163419 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:03.163340 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:03.372733 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.372620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal" event={"ID":"f1cf25a30bb24672cb7e749f4e426384","Type":"ContainerStarted","Data":"3f4e7b55af6b6da6a4b84e08e2a670e68da73c9f8a9bf43d67f28849559fcb0a"} Apr 17 16:53:03.376021 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.375992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" event={"ID":"f44d3d65-cb34-4e08-aad3-2bfd9398b990","Type":"ContainerStarted","Data":"fc849185defa980872c3a19937b43aec50aedbfd09cf2517b0314f581fb96910"} Apr 17 16:53:03.390134 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:03.390085 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-47.ec2.internal" podStartSLOduration=21.390067879 podStartE2EDuration="21.390067879s" podCreationTimestamp="2026-04-17 16:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:03.389024556 +0000 UTC m=+22.788453089" watchObservedRunningTime="2026-04-17 16:53:03.390067879 +0000 UTC m=+22.789496413" Apr 17 16:53:04.162243 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:04.162166 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:04.162393 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:04.162274 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:04.379719 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:04.379678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" event={"ID":"f44d3d65-cb34-4e08-aad3-2bfd9398b990","Type":"ContainerStarted","Data":"af02d5f1fd1e8178167b4f18c86d86d637046d1f1a9031b061290f3d61d1bbb7"} Apr 17 16:53:04.382837 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:04.382800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"8a4b73e1c0e39aa4c67ff7bdf2df3022cbcfc1a440e0a04489a3e05891056bce"} Apr 17 16:53:04.398040 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:04.397997 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2pcvz" podStartSLOduration=1.9737238910000001 podStartE2EDuration="23.397986114s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.353777756 +0000 UTC m=+1.753206266" lastFinishedPulling="2026-04-17 16:53:03.778039964 +0000 UTC m=+23.177468489" observedRunningTime="2026-04-17 16:53:04.397811823 +0000 UTC m=+23.797240378" watchObservedRunningTime="2026-04-17 16:53:04.397986114 +0000 UTC m=+23.797414661" Apr 17 16:53:05.162882 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:53:05.162600 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:05.163037 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:05.162907 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:06.162816 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:06.162783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:06.163256 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:06.162916 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:06.667277 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:06.667252 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:53:06.667950 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:06.667934 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:53:06.675932 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:06.675822 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:53:06.676124 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:06.676112 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gpbll" Apr 17 16:53:07.165164 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.165141 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:07.165435 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:07.165241 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:07.389438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.389264 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3d26988-7d2d-401c-a249-4bebb0e0a0d6" containerID="10b0e04d1135d7f9cb5ec7ae645f1276541705b86f1d279597f9628ecdf69b15" exitCode=0 Apr 17 16:53:07.389602 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.389352 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerDied","Data":"10b0e04d1135d7f9cb5ec7ae645f1276541705b86f1d279597f9628ecdf69b15"} Apr 17 16:53:07.395354 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.395319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" event={"ID":"a9562614-f113-46e2-95eb-ab53f4ee4f5d","Type":"ContainerStarted","Data":"40f87446ad6105fbb1c1467f400c41b93107911d2329ac860f3797cff5f474a8"} Apr 17 16:53:07.395619 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.395601 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:53:07.395759 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.395743 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:53:07.410129 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.410109 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:53:07.410510 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.410494 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:53:07.441582 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.441535 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" podStartSLOduration=7.460547183 podStartE2EDuration="26.441521269s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.396578364 +0000 UTC m=+1.796006878" lastFinishedPulling="2026-04-17 16:53:01.37755244 +0000 UTC m=+20.776980964" observedRunningTime="2026-04-17 16:53:07.440878744 +0000 UTC m=+26.840307317" watchObservedRunningTime="2026-04-17 16:53:07.441521269 +0000 UTC m=+26.840949800" Apr 17 16:53:07.460459 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:07.460423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:53:08.162495 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.162464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:08.162755 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:08.162590 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:08.399110 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.399073 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3d26988-7d2d-401c-a249-4bebb0e0a0d6" containerID="67c3818e63e05db4452118a4b102b8bb510bdadea0b85d0997abb086b93cb31b" exitCode=0 Apr 17 16:53:08.399555 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.399115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerDied","Data":"67c3818e63e05db4452118a4b102b8bb510bdadea0b85d0997abb086b93cb31b"} Apr 17 16:53:08.705048 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.705021 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7t66h"] Apr 17 16:53:08.705183 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.705138 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:08.705297 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:08.705247 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:08.707798 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.707776 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pjg74"] Apr 17 16:53:08.707875 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:08.707864 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:08.707978 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:08.707942 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:09.405291 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:09.405195 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3d26988-7d2d-401c-a249-4bebb0e0a0d6" containerID="e763c59999b96a416a09d2d448b99f9d197e531de7cf603644fa6218dae089a3" exitCode=0 Apr 17 16:53:09.405291 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:09.405246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerDied","Data":"e763c59999b96a416a09d2d448b99f9d197e531de7cf603644fa6218dae089a3"} Apr 17 16:53:10.162234 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:10.162207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:10.162393 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:10.162330 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:10.162578 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:10.162207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:10.162713 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:10.162687 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:12.162738 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:12.162464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:12.163159 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:12.162489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:12.163159 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:12.162776 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pjg74" podUID="9ea89042-5289-40b6-8778-7ab4e248e54b" Apr 17 16:53:12.163159 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:12.162899 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7t66h" podUID="c9f04956-3cc9-4095-a965-b3737339bb37" Apr 17 16:53:13.898031 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:13.897984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:13.898419 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:13.898124 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:53:13.898419 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:13.898174 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs podName:c9f04956-3cc9-4095-a965-b3737339bb37 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:45.898160308 +0000 UTC m=+65.297588822 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs") pod "network-metrics-daemon-7t66h" (UID: "c9f04956-3cc9-4095-a965-b3737339bb37") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:53:13.966221 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:13.966188 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-47.ec2.internal" event="NodeReady" Apr 17 16:53:13.966382 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:13.966357 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:53:13.998624 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:13.998585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:13.998804 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:13.998783 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:53:13.998866 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:13.998807 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:53:13.998866 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:13.998821 2574 projected.go:194] Error preparing data for projected volume kube-api-access-ggrd4 for pod openshift-network-diagnostics/network-check-target-pjg74: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 17 16:53:13.998952 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:13.998877 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4 podName:9ea89042-5289-40b6-8778-7ab4e248e54b nodeName:}" failed. No retries permitted until 2026-04-17 16:53:45.998861676 +0000 UTC m=+65.398290186 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggrd4" (UniqueName: "kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4") pod "network-check-target-pjg74" (UID: "9ea89042-5289-40b6-8778-7ab4e248e54b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:53:14.013223 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.013078 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5zhl2"] Apr 17 16:53:14.042933 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.042860 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-66g28"] Apr 17 16:53:14.043091 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.042959 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.045931 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.045906 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:53:14.046096 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.046067 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbmc\"" Apr 17 16:53:14.046230 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.046070 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:53:14.058004 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.057925 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5zhl2"] Apr 17 16:53:14.058127 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.058022 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-66g28"] Apr 17 16:53:14.058127 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.058066 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.063034 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.063017 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bkp6b\"" Apr 17 16:53:14.063244 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.063229 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:53:14.063434 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.063419 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:53:14.063609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.063589 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:53:14.162978 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.162946 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:14.163163 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.162946 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:14.165948 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.165929 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:53:14.166406 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.166382 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:53:14.166502 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.166436 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bkwcz\"" Apr 17 16:53:14.166558 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.166382 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:53:14.166819 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.166801 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wm7r2\"" Apr 17 16:53:14.199917 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.199879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxkmv\" (UniqueName: \"kubernetes.io/projected/b75234e4-2fea-42eb-9534-36f51bab38ff-kube-api-access-wxkmv\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.200032 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.199937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kpc\" (UniqueName: \"kubernetes.io/projected/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-kube-api-access-c5kpc\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " 
pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.200032 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.199965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.200032 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.199995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b75234e4-2fea-42eb-9534-36f51bab38ff-tmp-dir\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.200172 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.200088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.200172 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.200116 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b75234e4-2fea-42eb-9534-36f51bab38ff-config-volume\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.301222 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " 
pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.301222 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b75234e4-2fea-42eb-9534-36f51bab38ff-config-volume\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxkmv\" (UniqueName: \"kubernetes.io/projected/b75234e4-2fea-42eb-9534-36f51bab38ff-kube-api-access-wxkmv\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.301331 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kpc\" (UniqueName: \"kubernetes.io/projected/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-kube-api-access-c5kpc\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.301400 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert podName:4aec4098-9bf2-45c1-beb6-dcd84bb6ca17 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:14.801380229 +0000 UTC m=+34.200808750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert") pod "ingress-canary-66g28" (UID: "4aec4098-9bf2-45c1-beb6-dcd84bb6ca17") : secret "canary-serving-cert" not found Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b75234e4-2fea-42eb-9534-36f51bab38ff-tmp-dir\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.301477 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.301459 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:14.301808 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.301526 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls podName:b75234e4-2fea-42eb-9534-36f51bab38ff nodeName:}" failed. No retries permitted until 2026-04-17 16:53:14.801511387 +0000 UTC m=+34.200939901 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls") pod "dns-default-5zhl2" (UID: "b75234e4-2fea-42eb-9534-36f51bab38ff") : secret "dns-default-metrics-tls" not found Apr 17 16:53:14.301808 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301742 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b75234e4-2fea-42eb-9534-36f51bab38ff-tmp-dir\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.301965 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.301947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b75234e4-2fea-42eb-9534-36f51bab38ff-config-volume\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.313076 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.313052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxkmv\" (UniqueName: \"kubernetes.io/projected/b75234e4-2fea-42eb-9534-36f51bab38ff-kube-api-access-wxkmv\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.313196 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.313145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kpc\" (UniqueName: \"kubernetes.io/projected/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-kube-api-access-c5kpc\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.805897 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.805851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:14.806169 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:14.805923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:14.810511 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.806831 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:14.810511 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.806876 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:14.810511 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.806924 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls podName:b75234e4-2fea-42eb-9534-36f51bab38ff nodeName:}" failed. No retries permitted until 2026-04-17 16:53:15.806892863 +0000 UTC m=+35.206321379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls") pod "dns-default-5zhl2" (UID: "b75234e4-2fea-42eb-9534-36f51bab38ff") : secret "dns-default-metrics-tls" not found Apr 17 16:53:14.810511 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:14.806950 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert podName:4aec4098-9bf2-45c1-beb6-dcd84bb6ca17 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:53:15.806932254 +0000 UTC m=+35.206360781 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert") pod "ingress-canary-66g28" (UID: "4aec4098-9bf2-45c1-beb6-dcd84bb6ca17") : secret "canary-serving-cert" not found Apr 17 16:53:15.814828 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:15.814787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:15.815310 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:15.814854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:15.815310 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:15.814945 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:15.815310 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:15.814955 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:15.815310 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:15.815015 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls podName:b75234e4-2fea-42eb-9534-36f51bab38ff nodeName:}" failed. No retries permitted until 2026-04-17 16:53:17.814998123 +0000 UTC m=+37.214426652 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls") pod "dns-default-5zhl2" (UID: "b75234e4-2fea-42eb-9534-36f51bab38ff") : secret "dns-default-metrics-tls" not found Apr 17 16:53:15.815310 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:15.815030 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert podName:4aec4098-9bf2-45c1-beb6-dcd84bb6ca17 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:17.815024761 +0000 UTC m=+37.214453271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert") pod "ingress-canary-66g28" (UID: "4aec4098-9bf2-45c1-beb6-dcd84bb6ca17") : secret "canary-serving-cert" not found Apr 17 16:53:16.422065 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:16.422030 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3d26988-7d2d-401c-a249-4bebb0e0a0d6" containerID="1fb27be39351993a4883f639aa6bfdb68db2b404edd2973340cf14929d8b0700" exitCode=0 Apr 17 16:53:16.422224 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:16.422084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerDied","Data":"1fb27be39351993a4883f639aa6bfdb68db2b404edd2973340cf14929d8b0700"} Apr 17 16:53:17.426759 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:17.426727 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3d26988-7d2d-401c-a249-4bebb0e0a0d6" containerID="16561168e8790033e3970d445eb0c19f67e78f422de081223da91051604ee136" exitCode=0 Apr 17 16:53:17.427093 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:17.426776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" 
event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerDied","Data":"16561168e8790033e3970d445eb0c19f67e78f422de081223da91051604ee136"} Apr 17 16:53:17.828763 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:17.828558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:17.828949 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:17.828801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:17.828949 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:17.828723 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:17.828949 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:17.828892 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:17.828949 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:17.828906 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert podName:4aec4098-9bf2-45c1-beb6-dcd84bb6ca17 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.828886011 +0000 UTC m=+41.228314533 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert") pod "ingress-canary-66g28" (UID: "4aec4098-9bf2-45c1-beb6-dcd84bb6ca17") : secret "canary-serving-cert" not found Apr 17 16:53:17.828949 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:17.828928 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls podName:b75234e4-2fea-42eb-9534-36f51bab38ff nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.82891763 +0000 UTC m=+41.228346140 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls") pod "dns-default-5zhl2" (UID: "b75234e4-2fea-42eb-9534-36f51bab38ff") : secret "dns-default-metrics-tls" not found Apr 17 16:53:18.431995 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:18.431960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d47rg" event={"ID":"a3d26988-7d2d-401c-a249-4bebb0e0a0d6","Type":"ContainerStarted","Data":"96c356121086839ea03e64256eabff8253d5eecf2cae14f512e3515d7f009531"} Apr 17 16:53:18.464577 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:18.464526 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d47rg" podStartSLOduration=4.510980688 podStartE2EDuration="37.464511708s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:52:42.450609139 +0000 UTC m=+1.850037653" lastFinishedPulling="2026-04-17 16:53:15.404140149 +0000 UTC m=+34.803568673" observedRunningTime="2026-04-17 16:53:18.463360957 +0000 UTC m=+37.862789489" watchObservedRunningTime="2026-04-17 16:53:18.464511708 +0000 UTC m=+37.863940239" Apr 17 16:53:20.452465 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.452430 2574 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h"] Apr 17 16:53:20.490822 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.490793 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4"] Apr 17 16:53:20.490966 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.490941 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" Apr 17 16:53:20.493963 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.493940 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jvh2s\"" Apr 17 16:53:20.517705 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.517679 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ld9dq"] Apr 17 16:53:20.517795 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.517759 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" Apr 17 16:53:20.520245 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.520224 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.520352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.520243 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-d9zq7\"" Apr 17 16:53:20.520560 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.520547 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.530555 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.530539 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h"] Apr 17 16:53:20.530555 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.530558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4"] Apr 17 16:53:20.530685 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.530568 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ld9dq"] Apr 17 16:53:20.530685 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.530645 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.533699 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.533679 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.533818 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.533754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 16:53:20.533898 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.533818 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.533898 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.533824 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-x664w\"" Apr 17 16:53:20.534199 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.534182 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 16:53:20.547063 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.547046 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 16:53:20.573437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.573416 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-55c74c7f8d-k5d4w"] Apr 17 16:53:20.598634 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.598613 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55c74c7f8d-k5d4w"] Apr 17 16:53:20.598759 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.598730 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.601460 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.601442 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 16:53:20.602113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.602098 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2w2sj\"" Apr 17 16:53:20.602781 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.602550 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.603374 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.602903 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 16:53:20.603374 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.602943 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.603374 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.603027 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 16:53:20.603374 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.603245 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 16:53:20.647883 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.647861 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qk46\" (UniqueName: \"kubernetes.io/projected/ae8bba54-68da-4af3-803f-85e7bd8a4b87-kube-api-access-4qk46\") pod \"network-check-source-8894fc9bd-4tw8h\" (UID: \"ae8bba54-68da-4af3-803f-85e7bd8a4b87\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" Apr 17 16:53:20.648019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.647920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5247795a-9811-4fad-b182-136cc56544fd-config\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.648019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.647947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5247795a-9811-4fad-b182-136cc56544fd-serving-cert\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.648019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.647965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtwwq\" (UniqueName: \"kubernetes.io/projected/6975d595-5cf0-46b8-9850-ebe9f9ad039f-kube-api-access-dtwwq\") pod \"volume-data-source-validator-7c6cbb6c87-d8lj4\" (UID: \"6975d595-5cf0-46b8-9850-ebe9f9ad039f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" Apr 17 16:53:20.648019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.647981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5247795a-9811-4fad-b182-136cc56544fd-trusted-ca\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.648019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.648003 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5vzv\" (UniqueName: \"kubernetes.io/projected/5247795a-9811-4fad-b182-136cc56544fd-kube-api-access-f5vzv\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.673820 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.673799 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"] Apr 17 16:53:20.705865 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.705795 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"] Apr 17 16:53:20.705952 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.705899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" Apr 17 16:53:20.708670 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.708630 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-szxpw\"" Apr 17 16:53:20.708670 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.708639 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 16:53:20.708670 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.708630 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 16:53:20.708856 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.708687 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 16:53:20.708856 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.708691 2574 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:53:20.749301 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5247795a-9811-4fad-b182-136cc56544fd-config\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5247795a-9811-4fad-b182-136cc56544fd-serving-cert\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtwwq\" (UniqueName: \"kubernetes.io/projected/6975d595-5cf0-46b8-9850-ebe9f9ad039f-kube-api-access-dtwwq\") pod \"volume-data-source-validator-7c6cbb6c87-d8lj4\" (UID: \"6975d595-5cf0-46b8-9850-ebe9f9ad039f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5247795a-9811-4fad-b182-136cc56544fd-trusted-ca\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749371 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5vzv\" (UniqueName: \"kubernetes.io/projected/5247795a-9811-4fad-b182-136cc56544fd-kube-api-access-f5vzv\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.749437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-default-certificate\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.749794 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qk46\" (UniqueName: \"kubernetes.io/projected/ae8bba54-68da-4af3-803f-85e7bd8a4b87-kube-api-access-4qk46\") pod \"network-check-source-8894fc9bd-4tw8h\" 
(UID: \"ae8bba54-68da-4af3-803f-85e7bd8a4b87\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" Apr 17 16:53:20.749794 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749501 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hffzv\" (UniqueName: \"kubernetes.io/projected/1224934d-3953-4719-a6a4-8ca929c1d869-kube-api-access-hffzv\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.749794 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.749538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-stats-auth\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.750132 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.750110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5247795a-9811-4fad-b182-136cc56544fd-config\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.750256 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.750241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5247795a-9811-4fad-b182-136cc56544fd-trusted-ca\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.752883 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.752864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5247795a-9811-4fad-b182-136cc56544fd-serving-cert\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.758025 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.758001 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qk46\" (UniqueName: \"kubernetes.io/projected/ae8bba54-68da-4af3-803f-85e7bd8a4b87-kube-api-access-4qk46\") pod \"network-check-source-8894fc9bd-4tw8h\" (UID: \"ae8bba54-68da-4af3-803f-85e7bd8a4b87\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" Apr 17 16:53:20.758152 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.758133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtwwq\" (UniqueName: \"kubernetes.io/projected/6975d595-5cf0-46b8-9850-ebe9f9ad039f-kube-api-access-dtwwq\") pod \"volume-data-source-validator-7c6cbb6c87-d8lj4\" (UID: \"6975d595-5cf0-46b8-9850-ebe9f9ad039f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" Apr 17 16:53:20.758220 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.758149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5vzv\" (UniqueName: \"kubernetes.io/projected/5247795a-9811-4fad-b182-136cc56544fd-kube-api-access-f5vzv\") pod \"console-operator-9d4b6777b-ld9dq\" (UID: \"5247795a-9811-4fad-b182-136cc56544fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.799236 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.799198 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" Apr 17 16:53:20.825995 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.825963 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" Apr 17 16:53:20.838770 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.838745 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:20.850593 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850565 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.850593 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.850787 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-default-certificate\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.850787 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850676 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crkms\" (UniqueName: \"kubernetes.io/projected/d8f8e345-311f-4173-a487-5da18bb5d557-kube-api-access-crkms\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: 
\"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" Apr 17 16:53:20.850787 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hffzv\" (UniqueName: \"kubernetes.io/projected/1224934d-3953-4719-a6a4-8ca929c1d869-kube-api-access-hffzv\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.850787 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850726 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f8e345-311f-4173-a487-5da18bb5d557-config\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" Apr 17 16:53:20.850787 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-stats-auth\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:20.850787 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.850769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8f8e345-311f-4173-a487-5da18bb5d557-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" Apr 17 16:53:20.851106 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:20.850930 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.350912497 +0000 UTC m=+40.750341021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : configmap references non-existent config key: service-ca.crt
Apr 17 16:53:20.851106 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:20.850994 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:53:20.851106 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:20.851023 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:21.351015234 +0000 UTC m=+40.750443743 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : secret "router-metrics-certs-default" not found
Apr 17 16:53:20.853914 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.853892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-stats-auth\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:20.854017 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.853933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-default-certificate\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:20.862707 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.862673 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hffzv\" (UniqueName: \"kubernetes.io/projected/1224934d-3953-4719-a6a4-8ca929c1d869-kube-api-access-hffzv\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:20.951231 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.951195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crkms\" (UniqueName: \"kubernetes.io/projected/d8f8e345-311f-4173-a487-5da18bb5d557-kube-api-access-crkms\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" Apr 17 
16:53:20.951353 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.951272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f8e345-311f-4173-a487-5da18bb5d557-config\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"
Apr 17 16:53:20.951353 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.951322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8f8e345-311f-4173-a487-5da18bb5d557-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"
Apr 17 16:53:20.951901 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.951838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8f8e345-311f-4173-a487-5da18bb5d557-config\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"
Apr 17 16:53:20.953863 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.953842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8f8e345-311f-4173-a487-5da18bb5d557-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"
Apr 17 16:53:20.958522 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.958439 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h"]
Apr 17 16:53:20.962437 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.962414 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crkms\" (UniqueName: \"kubernetes.io/projected/d8f8e345-311f-4173-a487-5da18bb5d557-kube-api-access-crkms\") pod \"service-ca-operator-d6fc45fc5-dxzpk\" (UID: \"d8f8e345-311f-4173-a487-5da18bb5d557\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"
Apr 17 16:53:20.964924 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:20.964901 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8bba54_68da_4af3_803f_85e7bd8a4b87.slice/crio-c0b8775973c06c3247406e1879f3540444e29ef47c03fabbac5576d81ed1d3e1 WatchSource:0}: Error finding container c0b8775973c06c3247406e1879f3540444e29ef47c03fabbac5576d81ed1d3e1: Status 404 returned error can't find the container with id c0b8775973c06c3247406e1879f3540444e29ef47c03fabbac5576d81ed1d3e1
Apr 17 16:53:20.970912 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.970892 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4"]
Apr 17 16:53:20.973915 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:20.973891 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6975d595_5cf0_46b8_9850_ebe9f9ad039f.slice/crio-2764b2a38d9c0dd9a96273a125a4ac0ae8d4118eafa380d8fce34028528007d8 WatchSource:0}: Error finding container 2764b2a38d9c0dd9a96273a125a4ac0ae8d4118eafa380d8fce34028528007d8: Status 404 returned error can't find the container with id 2764b2a38d9c0dd9a96273a125a4ac0ae8d4118eafa380d8fce34028528007d8
Apr 17 16:53:20.990377 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:20.990343 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ld9dq"]
Apr 17 16:53:20.993008 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:20.992981 2574 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5247795a_9811_4fad_b182_136cc56544fd.slice/crio-6442891ac50dfb1311c59fee728c34f77f8f61026eab8e122f4ebafbf8684d96 WatchSource:0}: Error finding container 6442891ac50dfb1311c59fee728c34f77f8f61026eab8e122f4ebafbf8684d96: Status 404 returned error can't find the container with id 6442891ac50dfb1311c59fee728c34f77f8f61026eab8e122f4ebafbf8684d96
Apr 17 16:53:21.014124 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.014100 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"
Apr 17 16:53:21.143907 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.143881 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk"]
Apr 17 16:53:21.146999 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:21.146975 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f8e345_311f_4173_a487_5da18bb5d557.slice/crio-49afdb2699cff2f9d5fd4f060dcd9fa35806081dc3318f9aa311774448639fb4 WatchSource:0}: Error finding container 49afdb2699cff2f9d5fd4f060dcd9fa35806081dc3318f9aa311774448639fb4: Status 404 returned error can't find the container with id 49afdb2699cff2f9d5fd4f060dcd9fa35806081dc3318f9aa311774448639fb4
Apr 17 16:53:21.354415 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.354385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:21.354415 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.354418 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:21.354846 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.354822 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:22.354797329 +0000 UTC m=+41.754225855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : configmap references non-existent config key: service-ca.crt
Apr 17 16:53:21.354976 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.354847 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:53:21.354976 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.354945 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:22.354902521 +0000 UTC m=+41.754331031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : secret "router-metrics-certs-default" not found
Apr 17 16:53:21.438526 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.438487 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" event={"ID":"d8f8e345-311f-4173-a487-5da18bb5d557","Type":"ContainerStarted","Data":"49afdb2699cff2f9d5fd4f060dcd9fa35806081dc3318f9aa311774448639fb4"}
Apr 17 16:53:21.439638 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.439614 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" event={"ID":"6975d595-5cf0-46b8-9850-ebe9f9ad039f","Type":"ContainerStarted","Data":"2764b2a38d9c0dd9a96273a125a4ac0ae8d4118eafa380d8fce34028528007d8"}
Apr 17 16:53:21.440713 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.440682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" event={"ID":"ae8bba54-68da-4af3-803f-85e7bd8a4b87","Type":"ContainerStarted","Data":"c0b8775973c06c3247406e1879f3540444e29ef47c03fabbac5576d81ed1d3e1"}
Apr 17 16:53:21.441771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.441750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" event={"ID":"5247795a-9811-4fad-b182-136cc56544fd","Type":"ContainerStarted","Data":"6442891ac50dfb1311c59fee728c34f77f8f61026eab8e122f4ebafbf8684d96"}
Apr 17 16:53:21.860090 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.859312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") 
pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2"
Apr 17 16:53:21.860090 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:21.859385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28"
Apr 17 16:53:21.860090 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.859531 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 16:53:21.860090 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.859587 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert podName:4aec4098-9bf2-45c1-beb6-dcd84bb6ca17 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:29.859567134 +0000 UTC m=+49.258995650 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert") pod "ingress-canary-66g28" (UID: "4aec4098-9bf2-45c1-beb6-dcd84bb6ca17") : secret "canary-serving-cert" not found
Apr 17 16:53:21.860090 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.860013 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:53:21.860090 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:21.860057 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls podName:b75234e4-2fea-42eb-9534-36f51bab38ff nodeName:}" failed. No retries permitted until 2026-04-17 16:53:29.86004199 +0000 UTC m=+49.259470504 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls") pod "dns-default-5zhl2" (UID: "b75234e4-2fea-42eb-9534-36f51bab38ff") : secret "dns-default-metrics-tls" not found
Apr 17 16:53:22.364420 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:22.364382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:22.364420 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:22.364426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:22.364681 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:22.364624 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:53:22.364745 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:22.364710 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:24.364689478 +0000 UTC m=+43.764117989 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : secret "router-metrics-certs-default" not found
Apr 17 16:53:22.365093 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:22.365074 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:24.365063584 +0000 UTC m=+43.764492094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : configmap references non-existent config key: service-ca.crt
Apr 17 16:53:24.380703 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:24.380648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:24.381123 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:24.380717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:24.381123 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:24.380804 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:28.380782368 +0000 UTC m=+47.780210890 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : configmap references non-existent config key: service-ca.crt
Apr 17 16:53:24.381123 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:24.380841 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:53:24.381123 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:24.380905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:28.380888908 +0000 UTC m=+47.780317418 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : secret "router-metrics-certs-default" not found
Apr 17 16:53:26.310393 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.310359 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"]
Apr 17 16:53:26.331962 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.331923 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"]
Apr 17 16:53:26.332097 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.332074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"
Apr 17 16:53:26.335077 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.335057 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2dxvp\""
Apr 17 16:53:26.335077 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.335066 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 16:53:26.335241 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.335059 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 16:53:26.395514 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.395486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vsv\" (UniqueName: \"kubernetes.io/projected/1f436f86-bd41-4ff1-840b-69ba06f80f04-kube-api-access-z7vsv\") pod \"migrator-74bb7799d9-rptsw\" (UID: 
\"1f436f86-bd41-4ff1-840b-69ba06f80f04\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"
Apr 17 16:53:26.455744 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.455707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" event={"ID":"d8f8e345-311f-4173-a487-5da18bb5d557","Type":"ContainerStarted","Data":"623b702be3a8fc14f1945f7a18dd41d8b2d2e33f3b8408baa9c9a067d7736355"}
Apr 17 16:53:26.457074 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.457051 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" event={"ID":"6975d595-5cf0-46b8-9850-ebe9f9ad039f","Type":"ContainerStarted","Data":"1f9c281a3f4e26ec3ca44ba35a122826f2fa7fd180958f38ff37f62d625be99c"}
Apr 17 16:53:26.458362 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.458343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" event={"ID":"ae8bba54-68da-4af3-803f-85e7bd8a4b87","Type":"ContainerStarted","Data":"cee7da03e6c3b2ee645f36778b8bebbb78ed0ecd6e17827bf6793e760d2fd44f"}
Apr 17 16:53:26.459901 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.459887 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/0.log"
Apr 17 16:53:26.459974 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.459921 2574 generic.go:358] "Generic (PLEG): container finished" podID="5247795a-9811-4fad-b182-136cc56544fd" containerID="1a1a4c9982de98faa11114b5f28320e898dd685b0d08cc622699125e5372326d" exitCode=255
Apr 17 16:53:26.459974 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.459962 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" 
event={"ID":"5247795a-9811-4fad-b182-136cc56544fd","Type":"ContainerDied","Data":"1a1a4c9982de98faa11114b5f28320e898dd685b0d08cc622699125e5372326d"}
Apr 17 16:53:26.460140 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.460120 2574 scope.go:117] "RemoveContainer" containerID="1a1a4c9982de98faa11114b5f28320e898dd685b0d08cc622699125e5372326d"
Apr 17 16:53:26.472785 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.472747 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" podStartSLOduration=2.129911182 podStartE2EDuration="6.472736247s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:21.148891253 +0000 UTC m=+40.548319763" lastFinishedPulling="2026-04-17 16:53:25.49171631 +0000 UTC m=+44.891144828" observedRunningTime="2026-04-17 16:53:26.472115542 +0000 UTC m=+45.871544076" watchObservedRunningTime="2026-04-17 16:53:26.472736247 +0000 UTC m=+45.872164779"
Apr 17 16:53:26.496629 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.496563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vsv\" (UniqueName: \"kubernetes.io/projected/1f436f86-bd41-4ff1-840b-69ba06f80f04-kube-api-access-z7vsv\") pod \"migrator-74bb7799d9-rptsw\" (UID: \"1f436f86-bd41-4ff1-840b-69ba06f80f04\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"
Apr 17 16:53:26.505058 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.505032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vsv\" (UniqueName: \"kubernetes.io/projected/1f436f86-bd41-4ff1-840b-69ba06f80f04-kube-api-access-z7vsv\") pod \"migrator-74bb7799d9-rptsw\" (UID: \"1f436f86-bd41-4ff1-840b-69ba06f80f04\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"
Apr 17 16:53:26.511099 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.511054 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4tw8h" podStartSLOduration=1.976351673 podStartE2EDuration="6.511037249s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:20.966785233 +0000 UTC m=+40.366213747" lastFinishedPulling="2026-04-17 16:53:25.501470813 +0000 UTC m=+44.900899323" observedRunningTime="2026-04-17 16:53:26.509984623 +0000 UTC m=+45.909413156" watchObservedRunningTime="2026-04-17 16:53:26.511037249 +0000 UTC m=+45.910465780"
Apr 17 16:53:26.524606 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.524560 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-d8lj4" podStartSLOduration=2.00793738 podStartE2EDuration="6.524547747s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:20.975438822 +0000 UTC m=+40.374867335" lastFinishedPulling="2026-04-17 16:53:25.492049191 +0000 UTC m=+44.891477702" observedRunningTime="2026-04-17 16:53:26.523905491 +0000 UTC m=+45.923334024" watchObservedRunningTime="2026-04-17 16:53:26.524547747 +0000 UTC m=+45.923976279"
Apr 17 16:53:26.641008 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.640986 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"
Apr 17 16:53:26.755438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.755408 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw"]
Apr 17 16:53:26.758208 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:26.758177 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f436f86_bd41_4ff1_840b_69ba06f80f04.slice/crio-1d8738183cd5c1868c233df73dab0156030af8d11b1941bc5661b503874b4021 WatchSource:0}: Error finding container 1d8738183cd5c1868c233df73dab0156030af8d11b1941bc5661b503874b4021: Status 404 returned error can't find the container with id 1d8738183cd5c1868c233df73dab0156030af8d11b1941bc5661b503874b4021
Apr 17 16:53:26.972326 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:26.972251 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fphrb_fa2701e6-325c-4b24-9bcb-827a9099143e/dns-node-resolver/0.log"
Apr 17 16:53:27.463800 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.463761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw" event={"ID":"1f436f86-bd41-4ff1-840b-69ba06f80f04","Type":"ContainerStarted","Data":"1d8738183cd5c1868c233df73dab0156030af8d11b1941bc5661b503874b4021"}
Apr 17 16:53:27.465402 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.465382 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log"
Apr 17 16:53:27.465789 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.465774 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/0.log" Apr 
17 16:53:27.465881 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.465810 2574 generic.go:358] "Generic (PLEG): container finished" podID="5247795a-9811-4fad-b182-136cc56544fd" containerID="b9b2ed0bbc2526b13d7c3e5e5ce0ce382ab56790106394643666f62d1bfc25cb" exitCode=255
Apr 17 16:53:27.465941 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.465902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" event={"ID":"5247795a-9811-4fad-b182-136cc56544fd","Type":"ContainerDied","Data":"b9b2ed0bbc2526b13d7c3e5e5ce0ce382ab56790106394643666f62d1bfc25cb"}
Apr 17 16:53:27.465987 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.465955 2574 scope.go:117] "RemoveContainer" containerID="1a1a4c9982de98faa11114b5f28320e898dd685b0d08cc622699125e5372326d"
Apr 17 16:53:27.466279 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.466262 2574 scope.go:117] "RemoveContainer" containerID="b9b2ed0bbc2526b13d7c3e5e5ce0ce382ab56790106394643666f62d1bfc25cb"
Apr 17 16:53:27.466534 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:27.466509 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ld9dq_openshift-console-operator(5247795a-9811-4fad-b182-136cc56544fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" podUID="5247795a-9811-4fad-b182-136cc56544fd"
Apr 17 16:53:27.971794 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:27.971771 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-68lhq_22ae10bb-5884-4fb4-9c4a-473df80ffa49/node-ca/0.log"
Apr 17 16:53:28.412098 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:28.412071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:28.412224 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:28.412128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w"
Apr 17 16:53:28.412224 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:28.412189 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:36.412165977 +0000 UTC m=+55.811594492 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : configmap references non-existent config key: service-ca.crt
Apr 17 16:53:28.412314 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:28.412252 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:53:28.412314 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:28.412299 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:36.412288123 +0000 UTC m=+55.811716634 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : secret "router-metrics-certs-default" not found Apr 17 16:53:28.469583 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:28.469548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw" event={"ID":"1f436f86-bd41-4ff1-840b-69ba06f80f04","Type":"ContainerStarted","Data":"9a51a0ed116e7dd8ba629b842a4d51cfd1323f7d9145eb51d0b260362368c131"} Apr 17 16:53:28.471023 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:28.471005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 16:53:28.471492 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:28.471403 2574 scope.go:117] "RemoveContainer" containerID="b9b2ed0bbc2526b13d7c3e5e5ce0ce382ab56790106394643666f62d1bfc25cb" Apr 17 16:53:28.471623 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:28.471604 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ld9dq_openshift-console-operator(5247795a-9811-4fad-b182-136cc56544fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" podUID="5247795a-9811-4fad-b182-136cc56544fd" Apr 17 16:53:29.475788 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:29.475705 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw" event={"ID":"1f436f86-bd41-4ff1-840b-69ba06f80f04","Type":"ContainerStarted","Data":"c4432f693ed85e99ab4cf52e57fe559f760a047125deb060bb41e3b7f01eb1c5"} Apr 17 16:53:29.493180 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:53:29.493121 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-rptsw" podStartSLOduration=1.949276402 podStartE2EDuration="3.493107317s" podCreationTimestamp="2026-04-17 16:53:26 +0000 UTC" firstStartedPulling="2026-04-17 16:53:26.760359875 +0000 UTC m=+46.159788384" lastFinishedPulling="2026-04-17 16:53:28.304190789 +0000 UTC m=+47.703619299" observedRunningTime="2026-04-17 16:53:29.491317465 +0000 UTC m=+48.890745998" watchObservedRunningTime="2026-04-17 16:53:29.493107317 +0000 UTC m=+48.892535849" Apr 17 16:53:29.923819 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:29.923780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:29.923991 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:29.923895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:29.923991 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:29.923935 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:53:29.924121 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:29.924009 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert podName:4aec4098-9bf2-45c1-beb6-dcd84bb6ca17 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:45.923993417 +0000 UTC m=+65.323421944 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert") pod "ingress-canary-66g28" (UID: "4aec4098-9bf2-45c1-beb6-dcd84bb6ca17") : secret "canary-serving-cert" not found Apr 17 16:53:29.924121 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:29.924009 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:53:29.924121 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:29.924061 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls podName:b75234e4-2fea-42eb-9534-36f51bab38ff nodeName:}" failed. No retries permitted until 2026-04-17 16:53:45.924045346 +0000 UTC m=+65.323473857 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls") pod "dns-default-5zhl2" (UID: "b75234e4-2fea-42eb-9534-36f51bab38ff") : secret "dns-default-metrics-tls" not found Apr 17 16:53:30.839317 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:30.839276 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:30.839317 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:30.839319 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:30.839912 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:30.839789 2574 scope.go:117] "RemoveContainer" containerID="b9b2ed0bbc2526b13d7c3e5e5ce0ce382ab56790106394643666f62d1bfc25cb" Apr 17 16:53:30.840016 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:30.839994 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-9d4b6777b-ld9dq_openshift-console-operator(5247795a-9811-4fad-b182-136cc56544fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" podUID="5247795a-9811-4fad-b182-136cc56544fd" Apr 17 16:53:36.473861 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:36.473792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:36.473861 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:36.473860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:36.474493 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:36.474002 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:53:36.474493 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:36.474011 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:52.473984256 +0000 UTC m=+71.873412768 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : configmap references non-existent config key: service-ca.crt Apr 17 16:53:36.474493 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:53:36.474061 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs podName:1224934d-3953-4719-a6a4-8ca929c1d869 nodeName:}" failed. No retries permitted until 2026-04-17 16:53:52.474044323 +0000 UTC m=+71.873472842 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs") pod "router-default-55c74c7f8d-k5d4w" (UID: "1224934d-3953-4719-a6a4-8ca929c1d869") : secret "router-metrics-certs-default" not found Apr 17 16:53:39.416354 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:39.416326 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9zlb2" Apr 17 16:53:45.949373 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.949334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:45.949904 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.949387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:45.949904 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:53:45.949407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:45.951916 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.951882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b75234e4-2fea-42eb-9534-36f51bab38ff-metrics-tls\") pod \"dns-default-5zhl2\" (UID: \"b75234e4-2fea-42eb-9534-36f51bab38ff\") " pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:45.952031 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.952015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4aec4098-9bf2-45c1-beb6-dcd84bb6ca17-cert\") pod \"ingress-canary-66g28\" (UID: \"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17\") " pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:45.952244 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.952228 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:53:45.962541 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.962519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9f04956-3cc9-4095-a965-b3737339bb37-metrics-certs\") pod \"network-metrics-daemon-7t66h\" (UID: \"c9f04956-3cc9-4095-a965-b3737339bb37\") " pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:45.982126 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:45.982102 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bkwcz\"" Apr 17 16:53:45.990299 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:53:45.990281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7t66h" Apr 17 16:53:46.050571 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.050424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:46.053888 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.053821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggrd4\" (UniqueName: \"kubernetes.io/projected/9ea89042-5289-40b6-8778-7ab4e248e54b-kube-api-access-ggrd4\") pod \"network-check-target-pjg74\" (UID: \"9ea89042-5289-40b6-8778-7ab4e248e54b\") " pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:46.102179 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.102147 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7t66h"] Apr 17 16:53:46.105317 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:46.105289 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f04956_3cc9_4095_a965_b3737339bb37.slice/crio-c26263ef1e632410f026b4adcc3ff2d8cd9fe81ed61f0132ebd070e79f5d1d1f WatchSource:0}: Error finding container c26263ef1e632410f026b4adcc3ff2d8cd9fe81ed61f0132ebd070e79f5d1d1f: Status 404 returned error can't find the container with id c26263ef1e632410f026b4adcc3ff2d8cd9fe81ed61f0132ebd070e79f5d1d1f Apr 17 16:53:46.159567 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.159539 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8mbmc\"" Apr 17 
16:53:46.163331 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.163317 2574 scope.go:117] "RemoveContainer" containerID="b9b2ed0bbc2526b13d7c3e5e5ce0ce382ab56790106394643666f62d1bfc25cb" Apr 17 16:53:46.166219 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.166202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:46.171107 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.171093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bkp6b\"" Apr 17 16:53:46.179444 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.179424 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66g28" Apr 17 16:53:46.284343 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.283933 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wm7r2\"" Apr 17 16:53:46.284343 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.284108 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:46.302054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.301993 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5zhl2"] Apr 17 16:53:46.320643 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.320617 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-66g28"] Apr 17 16:53:46.325066 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:46.325039 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aec4098_9bf2_45c1_beb6_dcd84bb6ca17.slice/crio-6cb3b0e307f95213d9932b50739802078c2d43742d68a956b4031f9e44004420 WatchSource:0}: Error finding container 6cb3b0e307f95213d9932b50739802078c2d43742d68a956b4031f9e44004420: Status 404 returned error can't find the container with id 6cb3b0e307f95213d9932b50739802078c2d43742d68a956b4031f9e44004420 Apr 17 16:53:46.410825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.410791 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pjg74"] Apr 17 16:53:46.413742 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:46.413718 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ea89042_5289_40b6_8778_7ab4e248e54b.slice/crio-2f5d5a12fb53916fc1c7314f1b068a0c35c411c9b0f2783304a309f6daebeb34 WatchSource:0}: Error finding container 2f5d5a12fb53916fc1c7314f1b068a0c35c411c9b0f2783304a309f6daebeb34: Status 404 returned error can't find the container with id 2f5d5a12fb53916fc1c7314f1b068a0c35c411c9b0f2783304a309f6daebeb34 Apr 17 16:53:46.522181 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.522139 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pjg74" 
event={"ID":"9ea89042-5289-40b6-8778-7ab4e248e54b","Type":"ContainerStarted","Data":"512a93c7d243c8539e69edc371f27bc96329a9b8a86afa208640e2d5f3296263"} Apr 17 16:53:46.522181 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.522184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pjg74" event={"ID":"9ea89042-5289-40b6-8778-7ab4e248e54b","Type":"ContainerStarted","Data":"2f5d5a12fb53916fc1c7314f1b068a0c35c411c9b0f2783304a309f6daebeb34"} Apr 17 16:53:46.522397 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.522262 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pjg74" Apr 17 16:53:46.523943 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.523927 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 16:53:46.524035 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.523989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" event={"ID":"5247795a-9811-4fad-b182-136cc56544fd","Type":"ContainerStarted","Data":"2baf6f2fc452f49e5caa39b5ae290ebe635dd3ac57daaa8a21cf01a3094df450"} Apr 17 16:53:46.524256 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.524241 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:46.525600 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.525320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-66g28" event={"ID":"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17","Type":"ContainerStarted","Data":"6cb3b0e307f95213d9932b50739802078c2d43742d68a956b4031f9e44004420"} Apr 17 16:53:46.526977 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.526621 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5zhl2" event={"ID":"b75234e4-2fea-42eb-9534-36f51bab38ff","Type":"ContainerStarted","Data":"929d4f922e3336495293174b0a75b2f9d368ce4c2ce43f0c5afbd8c9c2a674f7"} Apr 17 16:53:46.527851 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.527827 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7t66h" event={"ID":"c9f04956-3cc9-4095-a965-b3737339bb37","Type":"ContainerStarted","Data":"c26263ef1e632410f026b4adcc3ff2d8cd9fe81ed61f0132ebd070e79f5d1d1f"} Apr 17 16:53:46.539947 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.539900 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pjg74" podStartSLOduration=65.539888403 podStartE2EDuration="1m5.539888403s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:46.538836242 +0000 UTC m=+65.938264775" watchObservedRunningTime="2026-04-17 16:53:46.539888403 +0000 UTC m=+65.939316934" Apr 17 16:53:46.556085 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.556034 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" podStartSLOduration=22.059029276 podStartE2EDuration="26.556018095s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="2026-04-17 16:53:20.994731491 +0000 UTC m=+40.394160004" lastFinishedPulling="2026-04-17 16:53:25.491720299 +0000 UTC m=+44.891148823" observedRunningTime="2026-04-17 16:53:46.555379093 +0000 UTC m=+65.954807625" watchObservedRunningTime="2026-04-17 16:53:46.556018095 +0000 UTC m=+65.955446630" Apr 17 16:53:46.937425 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:46.937220 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-ld9dq" Apr 17 16:53:49.539120 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.539083 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-66g28" event={"ID":"4aec4098-9bf2-45c1-beb6-dcd84bb6ca17","Type":"ContainerStarted","Data":"c43ab8b76631dba74be35da2f115157a006c91719314def8c690ea748d1de926"} Apr 17 16:53:49.540788 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.540758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5zhl2" event={"ID":"b75234e4-2fea-42eb-9534-36f51bab38ff","Type":"ContainerStarted","Data":"0e471875e3295d3078c426bf662278333aff56fc24ee2f5af321fa251078b8e8"} Apr 17 16:53:49.540788 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.540787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5zhl2" event={"ID":"b75234e4-2fea-42eb-9534-36f51bab38ff","Type":"ContainerStarted","Data":"bc12b1d87c9ca0e24c469f3c6db93f00efa00173d8555924261d6290f7fe70d9"} Apr 17 16:53:49.540959 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.540875 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5zhl2" Apr 17 16:53:49.542234 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.542210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7t66h" event={"ID":"c9f04956-3cc9-4095-a965-b3737339bb37","Type":"ContainerStarted","Data":"a8e7da241c4d52e13acae574272e0451104bbfb5eef584f8e255aa1e726ad3e9"} Apr 17 16:53:49.542311 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.542241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7t66h" event={"ID":"c9f04956-3cc9-4095-a965-b3737339bb37","Type":"ContainerStarted","Data":"97b07912a9a00e5ea8eee3e4498c0351bb7e929b07dd565d0cf361c1f5448ce6"} Apr 17 16:53:49.556076 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:53:49.556034 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-66g28" podStartSLOduration=34.422109372 podStartE2EDuration="36.556019353s" podCreationTimestamp="2026-04-17 16:53:13 +0000 UTC" firstStartedPulling="2026-04-17 16:53:46.327360731 +0000 UTC m=+65.726789245" lastFinishedPulling="2026-04-17 16:53:48.461270713 +0000 UTC m=+67.860699226" observedRunningTime="2026-04-17 16:53:49.554603496 +0000 UTC m=+68.954032029" watchObservedRunningTime="2026-04-17 16:53:49.556019353 +0000 UTC m=+68.955447883" Apr 17 16:53:49.572083 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.572035 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7t66h" podStartSLOduration=66.226138011 podStartE2EDuration="1m8.572019722s" podCreationTimestamp="2026-04-17 16:52:41 +0000 UTC" firstStartedPulling="2026-04-17 16:53:46.107134799 +0000 UTC m=+65.506563309" lastFinishedPulling="2026-04-17 16:53:48.453016508 +0000 UTC m=+67.852445020" observedRunningTime="2026-04-17 16:53:49.571601366 +0000 UTC m=+68.971029898" watchObservedRunningTime="2026-04-17 16:53:49.572019722 +0000 UTC m=+68.971448255" Apr 17 16:53:49.588837 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:49.588795 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5zhl2" podStartSLOduration=34.439292024 podStartE2EDuration="36.588783526s" podCreationTimestamp="2026-04-17 16:53:13 +0000 UTC" firstStartedPulling="2026-04-17 16:53:46.307631377 +0000 UTC m=+65.707059887" lastFinishedPulling="2026-04-17 16:53:48.457122876 +0000 UTC m=+67.856551389" observedRunningTime="2026-04-17 16:53:49.588586238 +0000 UTC m=+68.988014769" watchObservedRunningTime="2026-04-17 16:53:49.588783526 +0000 UTC m=+68.988212058" Apr 17 16:53:50.992962 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:50.992929 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/downloads-6bcc868b7-z564x"] Apr 17 16:53:50.995945 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:50.995926 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-z564x" Apr 17 16:53:50.998944 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:50.998925 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-lzzqf\"" Apr 17 16:53:50.999172 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:50.999159 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:53:50.999225 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:50.999210 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:53:51.009114 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.009092 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-z564x"] Apr 17 16:53:51.088792 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.088757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcbw7\" (UniqueName: \"kubernetes.io/projected/2b6e75b0-f8cc-4928-b513-ab3bff7a99e6-kube-api-access-bcbw7\") pod \"downloads-6bcc868b7-z564x\" (UID: \"2b6e75b0-f8cc-4928-b513-ab3bff7a99e6\") " pod="openshift-console/downloads-6bcc868b7-z564x" Apr 17 16:53:51.098006 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.097976 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86499485b7-58j9g"] Apr 17 16:53:51.100987 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.100970 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.107927 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.104092 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:53:51.107927 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.104135 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:53:51.107927 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.104214 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:53:51.107927 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.105975 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-97wlb\"" Apr 17 16:53:51.111370 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.111349 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:53:51.116490 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.116467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86499485b7-58j9g"] Apr 17 16:53:51.117338 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.117321 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cvvzl"] Apr 17 16:53:51.120378 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.120361 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.123243 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.123202 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:53:51.123243 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.123221 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:53:51.123243 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.123247 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:53:51.123491 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.123479 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:53:51.128967 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.128951 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-whh7q\"" Apr 17 16:53:51.140873 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.140851 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cvvzl"] Apr 17 16:53:51.189848 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.189815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5365987-3342-41ca-b42f-381c6079bd7f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.190054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.189860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/41155a82-d32d-42cb-8659-1373cd03f8ce-image-registry-private-configuration\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.189892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5365987-3342-41ca-b42f-381c6079bd7f-data-volume\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.190054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.189984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41155a82-d32d-42cb-8659-1373cd03f8ce-registry-certificates\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190030 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41155a82-d32d-42cb-8659-1373cd03f8ce-ca-trust-extracted\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190213 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41155a82-d32d-42cb-8659-1373cd03f8ce-trusted-ca\") pod \"image-registry-86499485b7-58j9g\" (UID: 
\"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190213 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-bound-sa-token\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190213 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41155a82-d32d-42cb-8659-1373cd03f8ce-installation-pull-secrets\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190213 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-registry-tls\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190213 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nj4q\" (UniqueName: \"kubernetes.io/projected/c5365987-3342-41ca-b42f-381c6079bd7f-kube-api-access-8nj4q\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.190393 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:53:51.190232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knd2g\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-kube-api-access-knd2g\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.190393 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcbw7\" (UniqueName: \"kubernetes.io/projected/2b6e75b0-f8cc-4928-b513-ab3bff7a99e6-kube-api-access-bcbw7\") pod \"downloads-6bcc868b7-z564x\" (UID: \"2b6e75b0-f8cc-4928-b513-ab3bff7a99e6\") " pod="openshift-console/downloads-6bcc868b7-z564x" Apr 17 16:53:51.190393 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5365987-3342-41ca-b42f-381c6079bd7f-crio-socket\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.190393 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.190319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5365987-3342-41ca-b42f-381c6079bd7f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.199943 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.199915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcbw7\" (UniqueName: 
\"kubernetes.io/projected/2b6e75b0-f8cc-4928-b513-ab3bff7a99e6-kube-api-access-bcbw7\") pod \"downloads-6bcc868b7-z564x\" (UID: \"2b6e75b0-f8cc-4928-b513-ab3bff7a99e6\") " pod="openshift-console/downloads-6bcc868b7-z564x" Apr 17 16:53:51.291487 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/41155a82-d32d-42cb-8659-1373cd03f8ce-image-registry-private-configuration\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.291702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5365987-3342-41ca-b42f-381c6079bd7f-data-volume\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.291702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41155a82-d32d-42cb-8659-1373cd03f8ce-registry-certificates\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.291702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41155a82-d32d-42cb-8659-1373cd03f8ce-ca-trust-extracted\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 
16:53:51.291864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41155a82-d32d-42cb-8659-1373cd03f8ce-trusted-ca\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.291864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-bound-sa-token\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.291864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41155a82-d32d-42cb-8659-1373cd03f8ce-installation-pull-secrets\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.291864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-registry-tls\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.292053 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nj4q\" (UniqueName: 
\"kubernetes.io/projected/c5365987-3342-41ca-b42f-381c6079bd7f-kube-api-access-8nj4q\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292053 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knd2g\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-kube-api-access-knd2g\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.292053 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5365987-3342-41ca-b42f-381c6079bd7f-data-volume\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292053 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291966 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5365987-3342-41ca-b42f-381c6079bd7f-crio-socket\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292053 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.291999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5365987-3342-41ca-b42f-381c6079bd7f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292351 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:53:51.292085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5365987-3342-41ca-b42f-381c6079bd7f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292649 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.292434 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/41155a82-d32d-42cb-8659-1373cd03f8ce-registry-certificates\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.292649 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.292467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c5365987-3342-41ca-b42f-381c6079bd7f-crio-socket\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292649 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.292624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c5365987-3342-41ca-b42f-381c6079bd7f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.292649 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.292647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/41155a82-d32d-42cb-8659-1373cd03f8ce-ca-trust-extracted\") pod \"image-registry-86499485b7-58j9g\" (UID: 
\"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.293092 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.292717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/41155a82-d32d-42cb-8659-1373cd03f8ce-trusted-ca\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.294776 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.294752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-registry-tls\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.294897 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.294776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/41155a82-d32d-42cb-8659-1373cd03f8ce-installation-pull-secrets\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.294897 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.294869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/41155a82-d32d-42cb-8659-1373cd03f8ce-image-registry-private-configuration\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.295315 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.295294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c5365987-3342-41ca-b42f-381c6079bd7f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.303837 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.303818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-bound-sa-token\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.303944 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.303915 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-z564x" Apr 17 16:53:51.305506 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.305480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nj4q\" (UniqueName: \"kubernetes.io/projected/c5365987-3342-41ca-b42f-381c6079bd7f-kube-api-access-8nj4q\") pod \"insights-runtime-extractor-cvvzl\" (UID: \"c5365987-3342-41ca-b42f-381c6079bd7f\") " pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.305586 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.305481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knd2g\" (UniqueName: \"kubernetes.io/projected/41155a82-d32d-42cb-8659-1373cd03f8ce-kube-api-access-knd2g\") pod \"image-registry-86499485b7-58j9g\" (UID: \"41155a82-d32d-42cb-8659-1373cd03f8ce\") " pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.414365 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.414339 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:51.426041 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.426016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-z564x"] Apr 17 16:53:51.428985 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.428961 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cvvzl" Apr 17 16:53:51.429088 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:51.429060 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6e75b0_f8cc_4928_b513_ab3bff7a99e6.slice/crio-ad493ffeb760006433eb74cf283bf7b5f82b050fc9b44694cf5c5822a80ac436 WatchSource:0}: Error finding container ad493ffeb760006433eb74cf283bf7b5f82b050fc9b44694cf5c5822a80ac436: Status 404 returned error can't find the container with id ad493ffeb760006433eb74cf283bf7b5f82b050fc9b44694cf5c5822a80ac436 Apr 17 16:53:51.549819 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.549786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-z564x" event={"ID":"2b6e75b0-f8cc-4928-b513-ab3bff7a99e6","Type":"ContainerStarted","Data":"ad493ffeb760006433eb74cf283bf7b5f82b050fc9b44694cf5c5822a80ac436"} Apr 17 16:53:51.549944 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.549922 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86499485b7-58j9g"] Apr 17 16:53:51.552288 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:51.552261 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41155a82_d32d_42cb_8659_1373cd03f8ce.slice/crio-6c105b162c208e00fd01d2db2656fa58178b6f1b864c7aa6fa08c94f5ea4a9cd WatchSource:0}: Error finding container 
6c105b162c208e00fd01d2db2656fa58178b6f1b864c7aa6fa08c94f5ea4a9cd: Status 404 returned error can't find the container with id 6c105b162c208e00fd01d2db2656fa58178b6f1b864c7aa6fa08c94f5ea4a9cd Apr 17 16:53:51.563440 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:51.563364 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cvvzl"] Apr 17 16:53:51.571706 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:51.571618 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5365987_3342_41ca_b42f_381c6079bd7f.slice/crio-0cd99a7b193acfc27dab1f4f511832e288c6c28edc2a9f93b94239ca030319e3 WatchSource:0}: Error finding container 0cd99a7b193acfc27dab1f4f511832e288c6c28edc2a9f93b94239ca030319e3: Status 404 returned error can't find the container with id 0cd99a7b193acfc27dab1f4f511832e288c6c28edc2a9f93b94239ca030319e3 Apr 17 16:53:52.501635 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.501585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:52.502086 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.501692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:52.502374 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.502353 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1224934d-3953-4719-a6a4-8ca929c1d869-service-ca-bundle\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:52.504461 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.504434 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1224934d-3953-4719-a6a4-8ca929c1d869-metrics-certs\") pod \"router-default-55c74c7f8d-k5d4w\" (UID: \"1224934d-3953-4719-a6a4-8ca929c1d869\") " pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:52.554465 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.554381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cvvzl" event={"ID":"c5365987-3342-41ca-b42f-381c6079bd7f","Type":"ContainerStarted","Data":"97550e5ca91297d70d76541efda081ef8f20b9ed32b2b98ce75939b3ce6ddfc6"} Apr 17 16:53:52.554465 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.554428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cvvzl" event={"ID":"c5365987-3342-41ca-b42f-381c6079bd7f","Type":"ContainerStarted","Data":"ff4836cea481ddebd967f593c859a0b03ca30d9db1953f8c7e72323cd5f90ea9"} Apr 17 16:53:52.554465 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.554443 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cvvzl" event={"ID":"c5365987-3342-41ca-b42f-381c6079bd7f","Type":"ContainerStarted","Data":"0cd99a7b193acfc27dab1f4f511832e288c6c28edc2a9f93b94239ca030319e3"} Apr 17 16:53:52.555793 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.555760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86499485b7-58j9g" 
event={"ID":"41155a82-d32d-42cb-8659-1373cd03f8ce","Type":"ContainerStarted","Data":"73cc960ea896df918bbfae021fd2902aaeabfbec1b55c521c510f7672eac0f92"} Apr 17 16:53:52.555793 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.555786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86499485b7-58j9g" event={"ID":"41155a82-d32d-42cb-8659-1373cd03f8ce","Type":"ContainerStarted","Data":"6c105b162c208e00fd01d2db2656fa58178b6f1b864c7aa6fa08c94f5ea4a9cd"} Apr 17 16:53:52.555974 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.555960 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86499485b7-58j9g" Apr 17 16:53:52.578469 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.578423 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86499485b7-58j9g" podStartSLOduration=1.5784075560000002 podStartE2EDuration="1.578407556s" podCreationTimestamp="2026-04-17 16:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:53:52.578290127 +0000 UTC m=+71.977718660" watchObservedRunningTime="2026-04-17 16:53:52.578407556 +0000 UTC m=+71.977836091" Apr 17 16:53:52.711571 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.711535 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2w2sj\"" Apr 17 16:53:52.719697 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.719648 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:52.881957 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:52.881928 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-55c74c7f8d-k5d4w"] Apr 17 16:53:52.885252 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:53:52.885219 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1224934d_3953_4719_a6a4_8ca929c1d869.slice/crio-57c6cdb4c6af926ebc9b40677058fa087b613a66b48e5e18ef59a5f26ae52e8d WatchSource:0}: Error finding container 57c6cdb4c6af926ebc9b40677058fa087b613a66b48e5e18ef59a5f26ae52e8d: Status 404 returned error can't find the container with id 57c6cdb4c6af926ebc9b40677058fa087b613a66b48e5e18ef59a5f26ae52e8d Apr 17 16:53:53.563644 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:53.563603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" event={"ID":"1224934d-3953-4719-a6a4-8ca929c1d869","Type":"ContainerStarted","Data":"109820dba6e93a12984e346db0523ca3dd16df7fb9ea5b865d3d726e296b4fda"} Apr 17 16:53:53.564129 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:53.563687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" event={"ID":"1224934d-3953-4719-a6a4-8ca929c1d869","Type":"ContainerStarted","Data":"57c6cdb4c6af926ebc9b40677058fa087b613a66b48e5e18ef59a5f26ae52e8d"} Apr 17 16:53:53.581972 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:53.581912 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" podStartSLOduration=33.58189425 podStartE2EDuration="33.58189425s" podCreationTimestamp="2026-04-17 16:53:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 
16:53:53.580570204 +0000 UTC m=+72.979998730" watchObservedRunningTime="2026-04-17 16:53:53.58189425 +0000 UTC m=+72.981322783" Apr 17 16:53:53.720365 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:53.720325 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:53.723464 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:53.723438 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:54.572268 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:54.572181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cvvzl" event={"ID":"c5365987-3342-41ca-b42f-381c6079bd7f","Type":"ContainerStarted","Data":"53147c22d525eddc84c27b75000bd6378c2108c28d4e73d9f5f4637d70a8b978"} Apr 17 16:53:54.572776 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:54.572750 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:54.573866 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:54.573845 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-55c74c7f8d-k5d4w" Apr 17 16:53:54.589103 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:54.589049 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cvvzl" podStartSLOduration=0.942480384 podStartE2EDuration="3.589035821s" podCreationTimestamp="2026-04-17 16:53:51 +0000 UTC" firstStartedPulling="2026-04-17 16:53:51.625501433 +0000 UTC m=+71.024929945" lastFinishedPulling="2026-04-17 16:53:54.272056869 +0000 UTC m=+73.671485382" observedRunningTime="2026-04-17 16:53:54.588822627 +0000 UTC m=+73.988251159" watchObservedRunningTime="2026-04-17 16:53:54.589035821 +0000 UTC m=+73.988464352" Apr 17 16:53:59.547579 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:53:59.547459 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5zhl2" Apr 17 16:54:03.463157 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.463114 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jvkrm"] Apr 17 16:54:03.468546 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.468523 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.471372 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.471248 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:54:03.471372 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.471253 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:54:03.471372 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.471318 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:54:03.471372 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.471253 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:54:03.471690 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.471536 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-z9mwd\"" Apr 17 16:54:03.472600 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.472580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:54:03.472772 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.472587 2574 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:54:03.599916 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.599879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-metrics-client-ca\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600111 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.599932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600111 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.599973 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-accelerators-collector-config\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600111 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.600057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8qn\" (UniqueName: \"kubernetes.io/projected/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-kube-api-access-wd8qn\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600284 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.600129 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-textfile\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600284 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.600157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-tls\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600284 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.600186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-sys\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600284 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.600230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-wtmp\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.600284 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.600274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-root\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701349 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-root\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-metrics-client-ca\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-accelerators-collector-config\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-root\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " 
pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701481 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8qn\" (UniqueName: \"kubernetes.io/projected/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-kube-api-access-wd8qn\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-textfile\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701601 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-tls\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701938 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-sys\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701938 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-wtmp\") pod \"node-exporter-jvkrm\" (UID: 
\"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701938 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-wtmp\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.701938 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.701906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-sys\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.702144 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.702036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-textfile\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.702144 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.702116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-metrics-client-ca\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.702238 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.702116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.704184 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.704161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.704323 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.704304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-node-exporter-tls\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.710776 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.710757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd8qn\" (UniqueName: \"kubernetes.io/projected/56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e-kube-api-access-wd8qn\") pod \"node-exporter-jvkrm\" (UID: \"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e\") " pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:03.779779 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:03.779750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jvkrm" Apr 17 16:54:04.529398 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.529357 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:54:04.534937 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.534913 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.537751 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.537711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 16:54:04.537900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.537799 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kjq2q\"" Apr 17 16:54:04.537900 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.537845 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 16:54:04.538045 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.537711 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 16:54:04.538045 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.538024 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 16:54:04.538159 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.538070 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 16:54:04.538235 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.538185 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 16:54:04.538299 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.538244 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 16:54:04.538299 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.538261 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 16:54:04.538988 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.538968 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 16:54:04.545811 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.545408 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:54:04.610858 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.610829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-volume\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611030 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.610869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611030 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.610955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611030 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-out\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611030 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611251 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-web-config\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611251 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcx9q\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-kube-api-access-fcx9q\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611251 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611251 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:54:04.611213 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611251 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611442 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611442 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611357 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.611442 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.611392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712735 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-web-config\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712735 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcx9q\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-kube-api-access-fcx9q\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712762 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712824 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.712981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.712976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-volume\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.713349 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:54:04.713002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.713349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.713035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.713349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.713063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-out\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.713349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.713086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.713349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.713321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.713635 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:54:04.713611 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle podName:f79711b8-86ad-4d30-9a85-627348a0e1f0 nodeName:}" failed. No retries permitted until 2026-04-17 16:54:05.213587448 +0000 UTC m=+84.613015979 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0") : configmap references non-existent config key: ca-bundle.crt Apr 17 16:54:04.715166 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.715134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.716253 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.716230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-web-config\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.717597 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.716523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.717597 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:54:04.716946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.717597 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.717557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.718631 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.718513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-out\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.718760 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.718738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-volume\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.718834 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.718738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.719004 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.718983 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.719710 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.719689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:04.721114 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:04.721097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcx9q\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-kube-api-access-fcx9q\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:05.218343 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:05.218301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:05.219218 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:05.219194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:05.449484 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:05.449452 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:54:07.040535 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.040497 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-879b74458-h9jfj"] Apr 17 16:54:07.045729 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.045706 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-879b74458-h9jfj" Apr 17 16:54:07.048687 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.048496 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:54:07.048687 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.048570 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:54:07.049717 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.049697 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:54:07.049992 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.049971 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:54:07.050113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.050095 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-h7h8x\"" Apr 17 16:54:07.050440 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.050418 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 16:54:07.054322 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.054055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 16:54:07.058623 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.058596 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-879b74458-h9jfj"]
Apr 17 16:54:07.138155 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-serving-cert\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.138336 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-oauth-serving-cert\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.138336 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-trusted-ca-bundle\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.138442 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-service-ca\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.138442 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6x5\" (UniqueName: \"kubernetes.io/projected/3fbfc290-6339-4c96-a76a-774c6a277a0e-kube-api-access-zk6x5\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.138535 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-oauth-config\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.138535 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.138475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-config\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.239567 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.239525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-service-ca\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.239777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.239583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6x5\" (UniqueName: \"kubernetes.io/projected/3fbfc290-6339-4c96-a76a-774c6a277a0e-kube-api-access-zk6x5\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.239777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.239606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-oauth-config\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.239982 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.239950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-config\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.240113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.240020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-serving-cert\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.240113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.240060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-oauth-serving-cert\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.240113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.240098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-trusted-ca-bundle\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.240352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.240323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-service-ca\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.240716 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.240689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-config\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.240808 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.240696 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-oauth-serving-cert\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.241242 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.241214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-trusted-ca-bundle\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.242669 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.242633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-oauth-config\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.243210 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.243183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-serving-cert\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.249162 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.249118 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6x5\" (UniqueName: \"kubernetes.io/projected/3fbfc290-6339-4c96-a76a-774c6a277a0e-kube-api-access-zk6x5\") pod \"console-879b74458-h9jfj\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") " pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.291921 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:54:07.291848 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56bcbda8_7c8f_49c4_9b4f_39c52a6c6a7e.slice/crio-0dd3170d6d825ee7da3dd36b3ca30a1421aa655fc48226185d14e95622658cda WatchSource:0}: Error finding container 0dd3170d6d825ee7da3dd36b3ca30a1421aa655fc48226185d14e95622658cda: Status 404 returned error can't find the container with id 0dd3170d6d825ee7da3dd36b3ca30a1421aa655fc48226185d14e95622658cda
Apr 17 16:54:07.358527 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.358498 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:07.423491 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.423286 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:54:07.427206 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:54:07.426787 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf79711b8_86ad_4d30_9a85_627348a0e1f0.slice/crio-caaaeff102e88e6f378a03f7ed4a9c458af496a18b2f27f0bae5135e37b3f1ab WatchSource:0}: Error finding container caaaeff102e88e6f378a03f7ed4a9c458af496a18b2f27f0bae5135e37b3f1ab: Status 404 returned error can't find the container with id caaaeff102e88e6f378a03f7ed4a9c458af496a18b2f27f0bae5135e37b3f1ab
Apr 17 16:54:07.499513 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.499477 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-879b74458-h9jfj"]
Apr 17 16:54:07.502266 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:54:07.502244 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbfc290_6339_4c96_a76a_774c6a277a0e.slice/crio-311f1300c64b4a20f5bbb3e805f1eb0e701c27804a80dc77ec5dbda9fdcd33cf WatchSource:0}: Error finding container 311f1300c64b4a20f5bbb3e805f1eb0e701c27804a80dc77ec5dbda9fdcd33cf: Status 404 returned error can't find the container with id 311f1300c64b4a20f5bbb3e805f1eb0e701c27804a80dc77ec5dbda9fdcd33cf
Apr 17 16:54:07.614671 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.614432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-z564x" event={"ID":"2b6e75b0-f8cc-4928-b513-ab3bff7a99e6","Type":"ContainerStarted","Data":"fed75b5152ce32b3296460ab5eed0c589dbe5523ebc5f20bcfc3b5d36b8e5ae0"}
Apr 17 16:54:07.614671 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.614563 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-z564x"
Apr 17 16:54:07.616369 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.616337 2574 patch_prober.go:28] interesting pod/downloads-6bcc868b7-z564x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.13:8080/\": dial tcp 10.134.0.13:8080: connect: connection refused" start-of-body=
Apr 17 16:54:07.616504 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.616392 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-z564x" podUID="2b6e75b0-f8cc-4928-b513-ab3bff7a99e6" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.13:8080/\": dial tcp 10.134.0.13:8080: connect: connection refused"
Apr 17 16:54:07.617295 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.617257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-879b74458-h9jfj" event={"ID":"3fbfc290-6339-4c96-a76a-774c6a277a0e","Type":"ContainerStarted","Data":"311f1300c64b4a20f5bbb3e805f1eb0e701c27804a80dc77ec5dbda9fdcd33cf"}
Apr 17 16:54:07.618789 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.618760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jvkrm" event={"ID":"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e","Type":"ContainerStarted","Data":"0dd3170d6d825ee7da3dd36b3ca30a1421aa655fc48226185d14e95622658cda"}
Apr 17 16:54:07.620446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.620250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"caaaeff102e88e6f378a03f7ed4a9c458af496a18b2f27f0bae5135e37b3f1ab"}
Apr 17 16:54:07.638337 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.638291 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-z564x" podStartSLOduration=1.680558187 podStartE2EDuration="17.638274751s" podCreationTimestamp="2026-04-17 16:53:50 +0000 UTC" firstStartedPulling="2026-04-17 16:53:51.435317663 +0000 UTC m=+70.834746176" lastFinishedPulling="2026-04-17 16:54:07.393034228 +0000 UTC m=+86.792462740" observedRunningTime="2026-04-17 16:54:07.635072327 +0000 UTC m=+87.034500860" watchObservedRunningTime="2026-04-17 16:54:07.638274751 +0000 UTC m=+87.037703284"
Apr 17 16:54:07.920087 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.919998 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"]
Apr 17 16:54:07.925148 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.924880 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:07.928074 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.928040 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 16:54:07.928298 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.928270 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 16:54:07.928298 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.928296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-r8v52\""
Apr 17 16:54:07.928898 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.928581 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-58p46gelqt9j5\""
Apr 17 16:54:07.928898 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.928683 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 16:54:07.928898 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.928584 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 16:54:07.938562 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:07.938523 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"]
Apr 17 16:54:08.048558 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-secret-metrics-server-client-certs\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.048953 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/70e43a28-718b-4cd3-aa08-d04ad07a8213-metrics-server-audit-profiles\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.048953 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048643 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-client-ca-bundle\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.048953 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pxsm\" (UniqueName: \"kubernetes.io/projected/70e43a28-718b-4cd3-aa08-d04ad07a8213-kube-api-access-4pxsm\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.048953 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-secret-metrics-server-tls\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.048953 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e43a28-718b-4cd3-aa08-d04ad07a8213-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.048953 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.048861 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/70e43a28-718b-4cd3-aa08-d04ad07a8213-audit-log\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.150533 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-secret-metrics-server-client-certs\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.150533 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/70e43a28-718b-4cd3-aa08-d04ad07a8213-metrics-server-audit-profiles\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.150533 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-client-ca-bundle\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.150533 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pxsm\" (UniqueName: \"kubernetes.io/projected/70e43a28-718b-4cd3-aa08-d04ad07a8213-kube-api-access-4pxsm\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.150533 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-secret-metrics-server-tls\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.151482 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e43a28-718b-4cd3-aa08-d04ad07a8213-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.151482 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.150729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/70e43a28-718b-4cd3-aa08-d04ad07a8213-audit-log\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.151482 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.151144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/70e43a28-718b-4cd3-aa08-d04ad07a8213-audit-log\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.152091 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.152049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/70e43a28-718b-4cd3-aa08-d04ad07a8213-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.153522 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.153499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/70e43a28-718b-4cd3-aa08-d04ad07a8213-metrics-server-audit-profiles\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.155192 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.155168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-secret-metrics-server-tls\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.158206 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.158156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-client-ca-bundle\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.170203 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.170113 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/70e43a28-718b-4cd3-aa08-d04ad07a8213-secret-metrics-server-client-certs\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.170312 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.170236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pxsm\" (UniqueName: \"kubernetes.io/projected/70e43a28-718b-4cd3-aa08-d04ad07a8213-kube-api-access-4pxsm\") pod \"metrics-server-c9b8f5d79-jmwjl\" (UID: \"70e43a28-718b-4cd3-aa08-d04ad07a8213\") " pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.238895 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.238861 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:08.411549 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.411514 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"]
Apr 17 16:54:08.626802 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.626756 2574 generic.go:358] "Generic (PLEG): container finished" podID="56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e" containerID="9c23fc9ebaaab67eeba9296b8867aad8d5f437e7d427c2b929b76e27f0ac0fd0" exitCode=0
Apr 17 16:54:08.626959 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.626852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jvkrm" event={"ID":"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e","Type":"ContainerDied","Data":"9c23fc9ebaaab67eeba9296b8867aad8d5f437e7d427c2b929b76e27f0ac0fd0"}
Apr 17 16:54:08.630026 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.629972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl" event={"ID":"70e43a28-718b-4cd3-aa08-d04ad07a8213","Type":"ContainerStarted","Data":"2ce485ff89628860c572f3eab2576aa19b41475598eece9900bb1a76b81022fc"}
Apr 17 16:54:08.641842 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:08.641800 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-z564x"
Apr 17 16:54:09.634969 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:09.634937 2574 generic.go:358] "Generic (PLEG): container finished" podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="92fb099517a1674437ddb1ab45bcf7e1a8bb687c196104c02be6fd722acd96a5" exitCode=0
Apr 17 16:54:09.635416 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:09.635326 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"92fb099517a1674437ddb1ab45bcf7e1a8bb687c196104c02be6fd722acd96a5"}
Apr 17 16:54:09.638924 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:09.638883 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jvkrm" event={"ID":"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e","Type":"ContainerStarted","Data":"9c2ea8a5cdfcbd2861c171035c492845bf95d1c2223b3043b12be20d3e0ddf58"}
Apr 17 16:54:09.639106 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:09.638936 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jvkrm" event={"ID":"56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e","Type":"ContainerStarted","Data":"da49c7976c2902e3b9f80c7e3abc49bd4dd6331b9df03a0995daed26405cc0ef"}
Apr 17 16:54:09.699772 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:09.699728 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jvkrm" podStartSLOduration=5.983567759 podStartE2EDuration="6.699714268s" podCreationTimestamp="2026-04-17 16:54:03 +0000 UTC" firstStartedPulling="2026-04-17 16:54:07.293829528 +0000 UTC m=+86.693258038" lastFinishedPulling="2026-04-17 16:54:08.00997603 +0000 UTC m=+87.409404547" observedRunningTime="2026-04-17 16:54:09.697650471 +0000 UTC m=+89.097079003" watchObservedRunningTime="2026-04-17 16:54:09.699714268 +0000 UTC m=+89.099142799"
Apr 17 16:54:13.568470 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.568435 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86499485b7-58j9g"
Apr 17 16:54:13.657231 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.657189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"c40e82cb609e4c047d5e9686d5be6696437b8c62db7250de9df23dfeeb31bfec"}
Apr 17 16:54:13.657231 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.657238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"92b3a44c4a71f9cb5ed7cec1668af9bc869b9eb4c0f2d33adf2719f5d36d1bc5"}
Apr 17 16:54:13.657447 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.657251 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"526633195c05336fe51fc63eff883722aaa21ff364d7ecd200c8400e7679806c"}
Apr 17 16:54:13.657447 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.657263 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"0ab3b6e1b9e55f69635b5120021552a13a1c66a956f8983e9f27e6152240b4ad"}
Apr 17 16:54:13.657447 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.657274 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"91104cd868a3eba45b287504c66a7b84b09280f8069eb00ff24cb6a6282047ce"}
Apr 17 16:54:13.658894 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.658868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-879b74458-h9jfj" event={"ID":"3fbfc290-6339-4c96-a76a-774c6a277a0e","Type":"ContainerStarted","Data":"ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630"}
Apr 17 16:54:13.660364 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.660340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl" event={"ID":"70e43a28-718b-4cd3-aa08-d04ad07a8213","Type":"ContainerStarted","Data":"a6b517627a0a06ababcf47b8b63d3a823350da14abffe0bbded8d10ce8a70a91"}
Apr 17 16:54:13.689062 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.689001 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-879b74458-h9jfj" podStartSLOduration=1.6081470869999999 podStartE2EDuration="6.688982547s" podCreationTimestamp="2026-04-17 16:54:07 +0000 UTC" firstStartedPulling="2026-04-17 16:54:07.504716254 +0000 UTC m=+86.904144764" lastFinishedPulling="2026-04-17 16:54:12.585551711 +0000 UTC m=+91.984980224" observedRunningTime="2026-04-17 16:54:13.688050843 +0000 UTC m=+93.087479373" watchObservedRunningTime="2026-04-17 16:54:13.688982547 +0000 UTC m=+93.088411080"
Apr 17 16:54:13.709869 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:13.709813 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl" podStartSLOduration=2.541253447 podStartE2EDuration="6.709795499s" podCreationTimestamp="2026-04-17 16:54:07 +0000 UTC" firstStartedPulling="2026-04-17 16:54:08.417455907 +0000 UTC m=+87.816884418" lastFinishedPulling="2026-04-17 16:54:12.585997957 +0000 UTC m=+91.985426470" observedRunningTime="2026-04-17 16:54:13.70977553 +0000 UTC m=+93.109204063" watchObservedRunningTime="2026-04-17 16:54:13.709795499 +0000 UTC m=+93.109224032"
Apr 17 16:54:14.029345 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:14.029315 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-879b74458-h9jfj"]
Apr 17 16:54:15.675305 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:15.675250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerStarted","Data":"62b138f32737ebb5a2be96ecfb38910ccfc154ebb03277dd694679286be2fb8d"}
Apr 17 16:54:15.710169 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:15.710116 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.486311043 podStartE2EDuration="11.710100322s" podCreationTimestamp="2026-04-17 16:54:04 +0000 UTC" firstStartedPulling="2026-04-17 16:54:07.430048718 +0000 UTC m=+86.829477242" lastFinishedPulling="2026-04-17 16:54:14.653837992 +0000 UTC m=+94.053266521" observedRunningTime="2026-04-17 16:54:15.707238623 +0000 UTC m=+95.106667157" watchObservedRunningTime="2026-04-17 16:54:15.710100322 +0000 UTC m=+95.109528855"
Apr 17 16:54:17.359536 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:17.359499 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:17.533140 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:17.533107 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pjg74"
Apr 17 16:54:28.239777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:28.239739 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:28.239777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:28.239780 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl"
Apr 17 16:54:40.696864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:40.696814 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-879b74458-h9jfj" podUID="3fbfc290-6339-4c96-a76a-774c6a277a0e" containerName="console" containerID="cri-o://ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630" gracePeriod=15
Apr 17 16:54:41.006013 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.005991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-879b74458-h9jfj_3fbfc290-6339-4c96-a76a-774c6a277a0e/console/0.log"
Apr 17 16:54:41.006129 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.006062 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-879b74458-h9jfj"
Apr 17 16:54:41.129251 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129219 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-oauth-serving-cert\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129251 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129257 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-oauth-config\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129492 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129285 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk6x5\" (UniqueName: \"kubernetes.io/projected/3fbfc290-6339-4c96-a76a-774c6a277a0e-kube-api-access-zk6x5\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129492 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129305 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-service-ca\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129492 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129342 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-trusted-ca-bundle\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129492 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129358 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-serving-cert\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129492 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129373 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-config\") pod \"3fbfc290-6339-4c96-a76a-774c6a277a0e\" (UID: \"3fbfc290-6339-4c96-a76a-774c6a277a0e\") "
Apr 17 16:54:41.129780 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129609 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:41.129833 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129811 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-service-ca" (OuterVolumeSpecName: "service-ca") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:41.129890 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129847 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:41.129890 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.129871 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-config" (OuterVolumeSpecName: "console-config") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:54:41.131730 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.131703 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:41.131860 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.131832 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:54:41.131901 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.131860 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbfc290-6339-4c96-a76a-774c6a277a0e-kube-api-access-zk6x5" (OuterVolumeSpecName: "kube-api-access-zk6x5") pod "3fbfc290-6339-4c96-a76a-774c6a277a0e" (UID: "3fbfc290-6339-4c96-a76a-774c6a277a0e"). InnerVolumeSpecName "kube-api-access-zk6x5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:54:41.229996 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.229952 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zk6x5\" (UniqueName: \"kubernetes.io/projected/3fbfc290-6339-4c96-a76a-774c6a277a0e-kube-api-access-zk6x5\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.229996 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.229991 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-service-ca\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.229996 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.230001 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-trusted-ca-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.230227 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:54:41.230011 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-serving-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.230227 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.230021 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.230227 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.230029 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3fbfc290-6339-4c96-a76a-774c6a277a0e-oauth-serving-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.230227 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.230037 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3fbfc290-6339-4c96-a76a-774c6a277a0e-console-oauth-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:54:41.752754 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.752716 2574 generic.go:358] "Generic (PLEG): container finished" podID="3fbfc290-6339-4c96-a76a-774c6a277a0e" containerID="ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630" exitCode=2 Apr 17 16:54:41.753184 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.752777 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-879b74458-h9jfj" Apr 17 16:54:41.753184 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.752800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-879b74458-h9jfj" event={"ID":"3fbfc290-6339-4c96-a76a-774c6a277a0e","Type":"ContainerDied","Data":"ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630"} Apr 17 16:54:41.753184 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.752832 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-879b74458-h9jfj" event={"ID":"3fbfc290-6339-4c96-a76a-774c6a277a0e","Type":"ContainerDied","Data":"311f1300c64b4a20f5bbb3e805f1eb0e701c27804a80dc77ec5dbda9fdcd33cf"} Apr 17 16:54:41.753184 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.752846 2574 scope.go:117] "RemoveContainer" containerID="ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630" Apr 17 16:54:41.754411 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.754391 2574 generic.go:358] "Generic (PLEG): container finished" podID="d8f8e345-311f-4173-a487-5da18bb5d557" containerID="623b702be3a8fc14f1945f7a18dd41d8b2d2e33f3b8408baa9c9a067d7736355" exitCode=0 Apr 17 16:54:41.754505 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.754425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" event={"ID":"d8f8e345-311f-4173-a487-5da18bb5d557","Type":"ContainerDied","Data":"623b702be3a8fc14f1945f7a18dd41d8b2d2e33f3b8408baa9c9a067d7736355"} Apr 17 16:54:41.754753 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.754735 2574 scope.go:117] "RemoveContainer" containerID="623b702be3a8fc14f1945f7a18dd41d8b2d2e33f3b8408baa9c9a067d7736355" Apr 17 16:54:41.760414 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.760399 2574 scope.go:117] "RemoveContainer" containerID="ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630" Apr 17 16:54:41.760635 ip-10-0-138-47 
kubenswrapper[2574]: E0417 16:54:41.760616 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630\": container with ID starting with ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630 not found: ID does not exist" containerID="ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630" Apr 17 16:54:41.760723 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.760644 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630"} err="failed to get container status \"ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630\": rpc error: code = NotFound desc = could not find container \"ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630\": container with ID starting with ca2b898ae0e427e66c4b16109daab66c7110d67ffd025fec3534f670583d9630 not found: ID does not exist" Apr 17 16:54:41.809629 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.809605 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-879b74458-h9jfj"] Apr 17 16:54:41.816387 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:41.816365 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-879b74458-h9jfj"] Apr 17 16:54:42.760077 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:42.760040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dxzpk" event={"ID":"d8f8e345-311f-4173-a487-5da18bb5d557","Type":"ContainerStarted","Data":"ed05b3f76dedf1f9dfe70ece1bd833feb3881fb3c647d0fee3e794016cd9314d"} Apr 17 16:54:43.167317 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:43.167276 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbfc290-6339-4c96-a76a-774c6a277a0e" 
path="/var/lib/kubelet/pods/3fbfc290-6339-4c96-a76a-774c6a277a0e/volumes" Apr 17 16:54:48.245359 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:48.245329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl" Apr 17 16:54:48.249188 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:54:48.249166 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-c9b8f5d79-jmwjl" Apr 17 16:55:27.750174 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750089 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:55:27.750690 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750503 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="alertmanager" containerID="cri-o://91104cd868a3eba45b287504c66a7b84b09280f8069eb00ff24cb6a6282047ce" gracePeriod=120 Apr 17 16:55:27.750690 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750593 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-metric" containerID="cri-o://c40e82cb609e4c047d5e9686d5be6696437b8c62db7250de9df23dfeeb31bfec" gracePeriod=120 Apr 17 16:55:27.750690 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750620 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-web" containerID="cri-o://526633195c05336fe51fc63eff883722aaa21ff364d7ecd200c8400e7679806c" gracePeriod=120 Apr 17 16:55:27.750690 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750633 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="prom-label-proxy" containerID="cri-o://62b138f32737ebb5a2be96ecfb38910ccfc154ebb03277dd694679286be2fb8d" gracePeriod=120 Apr 17 16:55:27.750910 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750701 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="config-reloader" containerID="cri-o://0ab3b6e1b9e55f69635b5120021552a13a1c66a956f8983e9f27e6152240b4ad" gracePeriod=120 Apr 17 16:55:27.750910 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.750639 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy" containerID="cri-o://92b3a44c4a71f9cb5ed7cec1668af9bc869b9eb4c0f2d33adf2719f5d36d1bc5" gracePeriod=120 Apr 17 16:55:27.891314 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891287 2574 generic.go:358] "Generic (PLEG): container finished" podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="62b138f32737ebb5a2be96ecfb38910ccfc154ebb03277dd694679286be2fb8d" exitCode=0 Apr 17 16:55:27.891314 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891313 2574 generic.go:358] "Generic (PLEG): container finished" podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="c40e82cb609e4c047d5e9686d5be6696437b8c62db7250de9df23dfeeb31bfec" exitCode=0 Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891320 2574 generic.go:358] "Generic (PLEG): container finished" podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="92b3a44c4a71f9cb5ed7cec1668af9bc869b9eb4c0f2d33adf2719f5d36d1bc5" exitCode=0 Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891326 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="0ab3b6e1b9e55f69635b5120021552a13a1c66a956f8983e9f27e6152240b4ad" exitCode=0 Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891334 2574 generic.go:358] "Generic (PLEG): container finished" podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="91104cd868a3eba45b287504c66a7b84b09280f8069eb00ff24cb6a6282047ce" exitCode=0 Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"62b138f32737ebb5a2be96ecfb38910ccfc154ebb03277dd694679286be2fb8d"} Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"c40e82cb609e4c047d5e9686d5be6696437b8c62db7250de9df23dfeeb31bfec"} Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"92b3a44c4a71f9cb5ed7cec1668af9bc869b9eb4c0f2d33adf2719f5d36d1bc5"} Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"0ab3b6e1b9e55f69635b5120021552a13a1c66a956f8983e9f27e6152240b4ad"} Apr 17 16:55:27.891449 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:27.891422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"91104cd868a3eba45b287504c66a7b84b09280f8069eb00ff24cb6a6282047ce"} Apr 17 16:55:28.898560 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.898526 2574 generic.go:358] "Generic (PLEG): container finished" podID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerID="526633195c05336fe51fc63eff883722aaa21ff364d7ecd200c8400e7679806c" exitCode=0 Apr 17 16:55:28.899030 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.898583 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"526633195c05336fe51fc63eff883722aaa21ff364d7ecd200c8400e7679806c"} Apr 17 16:55:28.989239 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.989221 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:28.995719 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.995698 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-volume\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.995791 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.995734 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-main-db\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.995791 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.995765 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-out\") pod 
\"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.995906 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.995885 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.995960 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.995943 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcx9q\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-kube-api-access-fcx9q\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996000 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.995987 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996045 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996015 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-cluster-tls-config\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996097 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996044 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-tls-assets\") pod 
\"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996097 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996063 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:28.996097 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996069 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-metrics-client-ca\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996249 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996127 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-main-tls\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996249 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996161 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996268 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-web\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996352 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996311 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-web-config\") pod \"f79711b8-86ad-4d30-9a85-627348a0e1f0\" (UID: \"f79711b8-86ad-4d30-9a85-627348a0e1f0\") " Apr 17 16:55:28.996469 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996383 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:55:28.996574 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996556 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-main-db\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:28.997351 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.996582 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-metrics-client-ca\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:28.997351 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.997003 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod 
"f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:55:28.998740 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.998711 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:28.998934 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.998903 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:28.999257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.999216 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-out" (OuterVolumeSpecName: "config-out") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:55:28.999257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.999216 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-volume" (OuterVolumeSpecName: "config-volume") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:28.999998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.999972 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:55:28.999998 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:28.999985 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:29.000709 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.000681 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:29.000935 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.000909 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-kube-api-access-fcx9q" (OuterVolumeSpecName: "kube-api-access-fcx9q") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "kube-api-access-fcx9q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:55:29.004166 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.004139 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:29.009769 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.009626 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-web-config" (OuterVolumeSpecName: "web-config") pod "f79711b8-86ad-4d30-9a85-627348a0e1f0" (UID: "f79711b8-86ad-4d30-9a85-627348a0e1f0"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:55:29.097446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097371 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcx9q\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-kube-api-access-fcx9q\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097395 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097405 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-cluster-tls-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097416 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f79711b8-86ad-4d30-9a85-627348a0e1f0-tls-assets\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097426 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-main-tls\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097446 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097436 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f79711b8-86ad-4d30-9a85-627348a0e1f0-alertmanager-trusted-ca-bundle\") on node 
\"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097445 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097481 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-web-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097490 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-volume\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097498 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f79711b8-86ad-4d30-9a85-627348a0e1f0-config-out\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.097747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.097506 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f79711b8-86ad-4d30-9a85-627348a0e1f0-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:55:29.904579 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.904541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f79711b8-86ad-4d30-9a85-627348a0e1f0","Type":"ContainerDied","Data":"caaaeff102e88e6f378a03f7ed4a9c458af496a18b2f27f0bae5135e37b3f1ab"} Apr 17 
16:55:29.905041 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.904588 2574 scope.go:117] "RemoveContainer" containerID="62b138f32737ebb5a2be96ecfb38910ccfc154ebb03277dd694679286be2fb8d" Apr 17 16:55:29.905041 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.904675 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:29.911722 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.911703 2574 scope.go:117] "RemoveContainer" containerID="c40e82cb609e4c047d5e9686d5be6696437b8c62db7250de9df23dfeeb31bfec" Apr 17 16:55:29.917995 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.917981 2574 scope.go:117] "RemoveContainer" containerID="92b3a44c4a71f9cb5ed7cec1668af9bc869b9eb4c0f2d33adf2719f5d36d1bc5" Apr 17 16:55:29.923864 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.923847 2574 scope.go:117] "RemoveContainer" containerID="526633195c05336fe51fc63eff883722aaa21ff364d7ecd200c8400e7679806c" Apr 17 16:55:29.929201 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.929161 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:55:29.930565 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.930549 2574 scope.go:117] "RemoveContainer" containerID="0ab3b6e1b9e55f69635b5120021552a13a1c66a956f8983e9f27e6152240b4ad" Apr 17 16:55:29.932464 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.932443 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:55:29.937401 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.937384 2574 scope.go:117] "RemoveContainer" containerID="91104cd868a3eba45b287504c66a7b84b09280f8069eb00ff24cb6a6282047ce" Apr 17 16:55:29.943865 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.943850 2574 scope.go:117] "RemoveContainer" containerID="92fb099517a1674437ddb1ab45bcf7e1a8bb687c196104c02be6fd722acd96a5" Apr 17 16:55:29.968894 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:55:29.968872 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:55:29.969143 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969129 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy" Apr 17 16:55:29.969143 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969143 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969151 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="config-reloader" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969157 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="config-reloader" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969166 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fbfc290-6339-4c96-a76a-774c6a277a0e" containerName="console" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969171 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbfc290-6339-4c96-a76a-774c6a277a0e" containerName="console" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969181 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="prom-label-proxy" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969186 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="prom-label-proxy" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969194 
2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="init-config-reloader" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969199 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="init-config-reloader" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969208 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-metric" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969213 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-metric" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969222 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="alertmanager" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969227 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="alertmanager" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969235 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-web" Apr 17 16:55:29.969257 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969240 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-web" Apr 17 16:55:29.969702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969280 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-metric" Apr 17 16:55:29.969702 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:55:29.969288 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fbfc290-6339-4c96-a76a-774c6a277a0e" containerName="console" Apr 17 16:55:29.969702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969294 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="prom-label-proxy" Apr 17 16:55:29.969702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969301 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="alertmanager" Apr 17 16:55:29.969702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969307 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy-web" Apr 17 16:55:29.969702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969313 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="kube-rbac-proxy" Apr 17 16:55:29.969702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.969319 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" containerName="config-reloader" Apr 17 16:55:29.974233 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.974220 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:29.976812 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.976783 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 16:55:29.976904 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.976855 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 16:55:29.976904 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.976856 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kjq2q\"" Apr 17 16:55:29.976996 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.976911 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 16:55:29.977059 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.977043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 16:55:29.977112 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.977094 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 16:55:29.977201 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.977184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 16:55:29.977600 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.977574 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 16:55:29.977713 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.977644 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 16:55:29.982054 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.982034 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 16:55:29.986381 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:29.986361 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:55:30.004068 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be9d9f7c-d9a5-40a6-b98d-398d14885410-config-out\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004168 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-config-volume\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004168 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004091 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004168 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004117 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004267 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004192 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be9d9f7c-d9a5-40a6-b98d-398d14885410-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004267 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9d9f7c-d9a5-40a6-b98d-398d14885410-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004267 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004361 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004361 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004326 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlh8k\" (UniqueName: \"kubernetes.io/projected/be9d9f7c-d9a5-40a6-b98d-398d14885410-kube-api-access-zlh8k\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004424 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004424 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004397 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be9d9f7c-d9a5-40a6-b98d-398d14885410-tls-assets\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004424 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-web-config\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.004508 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.004427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/be9d9f7c-d9a5-40a6-b98d-398d14885410-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105116 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be9d9f7c-d9a5-40a6-b98d-398d14885410-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105116 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9d9f7c-d9a5-40a6-b98d-398d14885410-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105116 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105116 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:55:30.105134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlh8k\" (UniqueName: \"kubernetes.io/projected/be9d9f7c-d9a5-40a6-b98d-398d14885410-kube-api-access-zlh8k\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be9d9f7c-d9a5-40a6-b98d-398d14885410-tls-assets\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-web-config\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be9d9f7c-d9a5-40a6-b98d-398d14885410-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be9d9f7c-d9a5-40a6-b98d-398d14885410-config-out\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-config-volume\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.105930 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.105907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/be9d9f7c-d9a5-40a6-b98d-398d14885410-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.106738 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.106673 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/be9d9f7c-d9a5-40a6-b98d-398d14885410-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.106884 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.106808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be9d9f7c-d9a5-40a6-b98d-398d14885410-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.108587 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.108544 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.108739 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.108714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.108803 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.108762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/be9d9f7c-d9a5-40a6-b98d-398d14885410-config-out\") pod 
\"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.108962 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.108943 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-web-config\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.109049 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.108967 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/be9d9f7c-d9a5-40a6-b98d-398d14885410-tls-assets\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.109282 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.109261 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-config-volume\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.109427 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.109410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.109825 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.109808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.110082 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.110064 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/be9d9f7c-d9a5-40a6-b98d-398d14885410-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.113817 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.113797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlh8k\" (UniqueName: \"kubernetes.io/projected/be9d9f7c-d9a5-40a6-b98d-398d14885410-kube-api-access-zlh8k\") pod \"alertmanager-main-0\" (UID: \"be9d9f7c-d9a5-40a6-b98d-398d14885410\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.283470 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.283424 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:55:30.407952 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.407923 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:55:30.410451 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:55:30.410417 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe9d9f7c_d9a5_40a6_b98d_398d14885410.slice/crio-18a0caf070ae0f5bfe852833177ea59d4502ee0a8d6aac61ac08e783a6242669 WatchSource:0}: Error finding container 18a0caf070ae0f5bfe852833177ea59d4502ee0a8d6aac61ac08e783a6242669: Status 404 returned error can't find the container with id 18a0caf070ae0f5bfe852833177ea59d4502ee0a8d6aac61ac08e783a6242669 Apr 17 16:55:30.908931 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.908901 2574 generic.go:358] "Generic (PLEG): container finished" podID="be9d9f7c-d9a5-40a6-b98d-398d14885410" containerID="9ed5173e3e6d46e7d4c6de1ad6d09dd3aba4dab72de7a4e0ac7a6c008c23d2c5" exitCode=0 Apr 17 16:55:30.909349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.908996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerDied","Data":"9ed5173e3e6d46e7d4c6de1ad6d09dd3aba4dab72de7a4e0ac7a6c008c23d2c5"} Apr 17 16:55:30.909349 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:30.909031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"18a0caf070ae0f5bfe852833177ea59d4502ee0a8d6aac61ac08e783a6242669"} Apr 17 16:55:31.171120 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.171092 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f79711b8-86ad-4d30-9a85-627348a0e1f0" 
path="/var/lib/kubelet/pods/f79711b8-86ad-4d30-9a85-627348a0e1f0/volumes" Apr 17 16:55:31.801572 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.801540 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-8c79fc944-t9kcw"] Apr 17 16:55:31.804938 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.804917 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.807711 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.807625 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wnvd7\"" Apr 17 16:55:31.807711 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.807691 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 16:55:31.807711 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.807698 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 16:55:31.807892 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.807772 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 16:55:31.807892 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.807649 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 16:55:31.808090 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.808074 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 16:55:31.812909 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.812890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 16:55:31.816754 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.816841 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-telemeter-client-tls\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.816841 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5m4\" (UniqueName: \"kubernetes.io/projected/c7336f88-2edc-4d7e-a47a-10acf9c37205-kube-api-access-mn5m4\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.816959 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-secret-telemeter-client\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.816959 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-federate-client-tls\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.816959 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.817132 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-serving-certs-ca-bundle\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.817132 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.816999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-metrics-client-ca\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.818264 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.818242 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/telemeter-client-8c79fc944-t9kcw"] Apr 17 16:55:31.916249 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.916214 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"3d2daf680513299ce69a48dd0f234f30e0a7144d7776aa60e2e79a3b7f555e9f"} Apr 17 16:55:31.916249 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.916254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"4a057085fa32488145c085cb29de9d5a1eba6b6e6ff6811bac5bea66c774ad83"} Apr 17 16:55:31.916722 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.916268 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"ea944cae6a58171004af4714e32b2ea5b2a0034c696ee44bcb98b5f1debd6956"} Apr 17 16:55:31.916722 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.916279 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"125708e737bba3c4b86b1522a9157145a98c8c456f5c843d649feb41d4633ebc"} Apr 17 16:55:31.916722 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.916290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"57ed6367662e03579a67b9ed70d32b26cee3a6d7d1a417210fc52ad7274b07ec"} Apr 17 16:55:31.916722 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.916300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"be9d9f7c-d9a5-40a6-b98d-398d14885410","Type":"ContainerStarted","Data":"860bd073fc5de225f99916d81b950dbcb392d6a0cd75fe7c55076b6f9ff175ba"} Apr 17 16:55:31.917749 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.917729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-telemeter-client-tls\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.917807 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.917761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5m4\" (UniqueName: \"kubernetes.io/projected/c7336f88-2edc-4d7e-a47a-10acf9c37205-kube-api-access-mn5m4\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.917807 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.917781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-secret-telemeter-client\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.917993 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.917972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-federate-client-tls\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.918061 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.918044 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.918112 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.918096 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-serving-certs-ca-bundle\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.918162 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.918132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-metrics-client-ca\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.918214 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.918200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.918871 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.918829 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-serving-certs-ca-bundle\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.919114 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.919090 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-telemeter-trusted-ca-bundle\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.919212 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.919184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7336f88-2edc-4d7e-a47a-10acf9c37205-metrics-client-ca\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.920708 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.920686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-federate-client-tls\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.920794 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.920771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " 
pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.920891 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.920875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-secret-telemeter-client\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.921133 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.921113 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/c7336f88-2edc-4d7e-a47a-10acf9c37205-telemeter-client-tls\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.926161 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.926142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5m4\" (UniqueName: \"kubernetes.io/projected/c7336f88-2edc-4d7e-a47a-10acf9c37205-kube-api-access-mn5m4\") pod \"telemeter-client-8c79fc944-t9kcw\" (UID: \"c7336f88-2edc-4d7e-a47a-10acf9c37205\") " pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:31.942636 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:31.942599 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.9425879889999997 podStartE2EDuration="2.942587989s" podCreationTimestamp="2026-04-17 16:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:31.941716445 +0000 UTC m=+171.341144971" watchObservedRunningTime="2026-04-17 16:55:31.942587989 +0000 UTC m=+171.342016552" Apr 17 16:55:32.122227 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:55:32.122153 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" Apr 17 16:55:32.249269 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:32.249246 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-8c79fc944-t9kcw"] Apr 17 16:55:32.250944 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:55:32.250919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7336f88_2edc_4d7e_a47a_10acf9c37205.slice/crio-5e61784af5e527c8198a68fc0d03cefd90a16201c32b2e94a977b0dfd58f9bba WatchSource:0}: Error finding container 5e61784af5e527c8198a68fc0d03cefd90a16201c32b2e94a977b0dfd58f9bba: Status 404 returned error can't find the container with id 5e61784af5e527c8198a68fc0d03cefd90a16201c32b2e94a977b0dfd58f9bba Apr 17 16:55:32.922411 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:32.922371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" event={"ID":"c7336f88-2edc-4d7e-a47a-10acf9c37205","Type":"ContainerStarted","Data":"5e61784af5e527c8198a68fc0d03cefd90a16201c32b2e94a977b0dfd58f9bba"} Apr 17 16:55:34.929887 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:34.929836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" event={"ID":"c7336f88-2edc-4d7e-a47a-10acf9c37205","Type":"ContainerStarted","Data":"f74ad135496ae1d46fc458998c3e995d39675e70aaac6a2d8408bd7b3edda181"} Apr 17 16:55:34.929887 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:34.929884 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" event={"ID":"c7336f88-2edc-4d7e-a47a-10acf9c37205","Type":"ContainerStarted","Data":"177395524b8d0c39bf288a52ad523b35aad678e31cdf1cf7e658decfd5e3486c"} Apr 17 16:55:34.929887 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:34.929894 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" event={"ID":"c7336f88-2edc-4d7e-a47a-10acf9c37205","Type":"ContainerStarted","Data":"fdf13c5ef6a6956b16cf8cce218fc1d2125af8f5da3e7d4a657432f2bb76c195"} Apr 17 16:55:34.953275 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:34.953109 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-8c79fc944-t9kcw" podStartSLOduration=2.311497005 podStartE2EDuration="3.953095299s" podCreationTimestamp="2026-04-17 16:55:31 +0000 UTC" firstStartedPulling="2026-04-17 16:55:32.252783913 +0000 UTC m=+171.652212427" lastFinishedPulling="2026-04-17 16:55:33.894382207 +0000 UTC m=+173.293810721" observedRunningTime="2026-04-17 16:55:34.952338073 +0000 UTC m=+174.351766604" watchObservedRunningTime="2026-04-17 16:55:34.953095299 +0000 UTC m=+174.352523831" Apr 17 16:55:35.690270 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.690228 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84b778c6c4-9q27q"] Apr 17 16:55:35.693812 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.693766 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.696757 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.696725 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:55:35.697747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.697695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:55:35.697747 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.697709 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:55:35.698076 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.698054 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:55:35.698232 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.698057 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:55:35.698445 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.698421 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-h7h8x\"" Apr 17 16:55:35.702383 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.702360 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:55:35.705781 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.705758 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84b778c6c4-9q27q"] Apr 17 16:55:35.749835 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.749803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-oauth-serving-cert\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.749835 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.749853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7bx\" (UniqueName: \"kubernetes.io/projected/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-kube-api-access-rz7bx\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.750055 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.749907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-config\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.750055 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.749925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-trusted-ca-bundle\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.750055 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.750001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-oauth-config\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.750055 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:55:35.750034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-serving-cert\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.750055 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.750052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-service-ca\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.850894 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.850852 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-oauth-config\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.850894 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.850899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-serving-cert\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851127 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.850925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-service-ca\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " 
pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851127 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.850969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-oauth-serving-cert\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851127 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.850994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rz7bx\" (UniqueName: \"kubernetes.io/projected/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-kube-api-access-rz7bx\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851272 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.851119 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-config\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851272 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.851159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-trusted-ca-bundle\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851799 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.851777 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-service-ca\") pod \"console-84b778c6c4-9q27q\" 
(UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.851813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-oauth-serving-cert\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.851838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-config\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.851925 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.851910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-trusted-ca-bundle\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.853768 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.853744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-serving-cert\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.853855 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.853747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-oauth-config\") pod 
\"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:35.860681 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:35.860631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7bx\" (UniqueName: \"kubernetes.io/projected/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-kube-api-access-rz7bx\") pod \"console-84b778c6c4-9q27q\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:36.005829 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:36.005739 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:36.131827 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:36.131804 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84b778c6c4-9q27q"] Apr 17 16:55:36.134081 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:55:36.134059 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42dac430_abf3_4cd4_bec7_33ed8b1c5ade.slice/crio-963d7d1f722ac820dea4a3aae6f7bb73c24558d36fad7cd521438ec04c4931d2 WatchSource:0}: Error finding container 963d7d1f722ac820dea4a3aae6f7bb73c24558d36fad7cd521438ec04c4931d2: Status 404 returned error can't find the container with id 963d7d1f722ac820dea4a3aae6f7bb73c24558d36fad7cd521438ec04c4931d2 Apr 17 16:55:36.938277 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:36.938243 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b778c6c4-9q27q" event={"ID":"42dac430-abf3-4cd4-bec7-33ed8b1c5ade","Type":"ContainerStarted","Data":"44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52"} Apr 17 16:55:36.938277 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:36.938278 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-84b778c6c4-9q27q" event={"ID":"42dac430-abf3-4cd4-bec7-33ed8b1c5ade","Type":"ContainerStarted","Data":"963d7d1f722ac820dea4a3aae6f7bb73c24558d36fad7cd521438ec04c4931d2"} Apr 17 16:55:36.956378 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:36.956334 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84b778c6c4-9q27q" podStartSLOduration=1.956323531 podStartE2EDuration="1.956323531s" podCreationTimestamp="2026-04-17 16:55:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:36.954628272 +0000 UTC m=+176.354056807" watchObservedRunningTime="2026-04-17 16:55:36.956323531 +0000 UTC m=+176.355752063" Apr 17 16:55:46.006504 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:46.006468 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:46.006504 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:46.006514 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:46.011377 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:46.011351 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:55:46.973137 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:55:46.973109 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:56:46.393597 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.393563 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c5f89984-bqc4r"] Apr 17 16:56:46.398113 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.398091 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.408161 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.408133 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c5f89984-bqc4r"] Apr 17 16:56:46.527880 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.527840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-serving-cert\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.528038 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.527913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-oauth-config\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.528038 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.527955 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-service-ca\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.528038 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.527985 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-trusted-ca-bundle\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.528038 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.528002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-config\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.528038 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.528037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769qt\" (UniqueName: \"kubernetes.io/projected/a0e367d4-6a64-4dbd-aa5d-511e12497f79-kube-api-access-769qt\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.528198 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.528081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-oauth-serving-cert\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629457 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-769qt\" (UniqueName: \"kubernetes.io/projected/a0e367d4-6a64-4dbd-aa5d-511e12497f79-kube-api-access-769qt\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629457 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-oauth-serving-cert\") pod 
\"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629695 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-serving-cert\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629695 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-oauth-config\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629810 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-service-ca\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629810 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-trusted-ca-bundle\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.629810 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.629779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-config\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.630261 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.630228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-oauth-serving-cert\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.630379 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.630348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-service-ca\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.630637 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.630613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-config\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.630767 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.630686 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-trusted-ca-bundle\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.632407 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.632387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-serving-cert\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.632478 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.632385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-oauth-config\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.637690 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.637648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-769qt\" (UniqueName: \"kubernetes.io/projected/a0e367d4-6a64-4dbd-aa5d-511e12497f79-kube-api-access-769qt\") pod \"console-6c5f89984-bqc4r\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.710511 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.710427 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:46.838166 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:46.838084 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c5f89984-bqc4r"] Apr 17 16:56:46.840345 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:56:46.840305 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e367d4_6a64_4dbd_aa5d_511e12497f79.slice/crio-a23963ea41c7db66c9da0fbcde418425d7a9b67f69232de52e4ee429362eb71b WatchSource:0}: Error finding container a23963ea41c7db66c9da0fbcde418425d7a9b67f69232de52e4ee429362eb71b: Status 404 returned error can't find the container with id a23963ea41c7db66c9da0fbcde418425d7a9b67f69232de52e4ee429362eb71b Apr 17 16:56:47.138592 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:47.138558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5f89984-bqc4r" event={"ID":"a0e367d4-6a64-4dbd-aa5d-511e12497f79","Type":"ContainerStarted","Data":"a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13"} Apr 17 16:56:47.138592 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:47.138599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5f89984-bqc4r" event={"ID":"a0e367d4-6a64-4dbd-aa5d-511e12497f79","Type":"ContainerStarted","Data":"a23963ea41c7db66c9da0fbcde418425d7a9b67f69232de52e4ee429362eb71b"} Apr 17 16:56:47.156463 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:47.156419 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c5f89984-bqc4r" podStartSLOduration=1.156405436 podStartE2EDuration="1.156405436s" podCreationTimestamp="2026-04-17 16:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:56:47.155292601 +0000 UTC m=+246.554721133" 
watchObservedRunningTime="2026-04-17 16:56:47.156405436 +0000 UTC m=+246.555833967" Apr 17 16:56:56.711162 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:56.711121 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:56.711162 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:56.711166 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:56.715903 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:56.715883 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:57.170141 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:57.170113 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 16:56:57.216841 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:56:57.216811 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84b778c6c4-9q27q"] Apr 17 16:57:20.025702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.025651 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vdlnl"] Apr 17 16:57:20.028981 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.028962 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.031562 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.031542 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:57:20.037854 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.037825 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vdlnl"] Apr 17 16:57:20.207274 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.207239 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b872dd8-6b64-4209-aecb-57e4459eea02-dbus\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.207469 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.207368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b872dd8-6b64-4209-aecb-57e4459eea02-kubelet-config\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.207469 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.207407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b872dd8-6b64-4209-aecb-57e4459eea02-original-pull-secret\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.308191 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.308103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/7b872dd8-6b64-4209-aecb-57e4459eea02-dbus\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.308191 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.308169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b872dd8-6b64-4209-aecb-57e4459eea02-kubelet-config\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.308191 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.308194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b872dd8-6b64-4209-aecb-57e4459eea02-original-pull-secret\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.308410 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.308299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7b872dd8-6b64-4209-aecb-57e4459eea02-dbus\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.308410 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.308298 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7b872dd8-6b64-4209-aecb-57e4459eea02-kubelet-config\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.310595 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.310568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7b872dd8-6b64-4209-aecb-57e4459eea02-original-pull-secret\") pod \"global-pull-secret-syncer-vdlnl\" (UID: \"7b872dd8-6b64-4209-aecb-57e4459eea02\") " pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.339310 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.339281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vdlnl" Apr 17 16:57:20.456531 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:20.456508 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vdlnl"] Apr 17 16:57:20.459092 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:57:20.459066 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b872dd8_6b64_4209_aecb_57e4459eea02.slice/crio-30c6e49706e1d7dc73a419ce69ade303d3ee0fad5d8c60808b83137cf2efff85 WatchSource:0}: Error finding container 30c6e49706e1d7dc73a419ce69ade303d3ee0fad5d8c60808b83137cf2efff85: Status 404 returned error can't find the container with id 30c6e49706e1d7dc73a419ce69ade303d3ee0fad5d8c60808b83137cf2efff85 Apr 17 16:57:21.240642 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:21.240603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vdlnl" event={"ID":"7b872dd8-6b64-4209-aecb-57e4459eea02","Type":"ContainerStarted","Data":"30c6e49706e1d7dc73a419ce69ade303d3ee0fad5d8c60808b83137cf2efff85"} Apr 17 16:57:22.238599 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.238524 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84b778c6c4-9q27q" podUID="42dac430-abf3-4cd4-bec7-33ed8b1c5ade" containerName="console" containerID="cri-o://44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52" gracePeriod=15 Apr 17 16:57:22.500491 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:57:22.500428 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84b778c6c4-9q27q_42dac430-abf3-4cd4-bec7-33ed8b1c5ade/console/0.log" Apr 17 16:57:22.500839 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.500507 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:57:22.631868 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.631838 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz7bx\" (UniqueName: \"kubernetes.io/projected/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-kube-api-access-rz7bx\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.631868 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.631882 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-serving-cert\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.632091 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.631929 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-oauth-serving-cert\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.632091 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.631994 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-oauth-config\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.632091 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632027 
2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-trusted-ca-bundle\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.632091 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632057 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-config\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.632336 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632305 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-service-ca\") pod \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\" (UID: \"42dac430-abf3-4cd4-bec7-33ed8b1c5ade\") " Apr 17 16:57:22.632438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632386 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:57:22.632438 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632408 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-config" (OuterVolumeSpecName: "console-config") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:57:22.632561 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632497 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:57:22.632880 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632855 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-service-ca" (OuterVolumeSpecName: "service-ca") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:57:22.633006 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632963 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-trusted-ca-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:22.633006 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.632988 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:22.633006 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.633004 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-oauth-serving-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:22.634692 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.634628 
2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:57:22.634813 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.634782 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-kube-api-access-rz7bx" (OuterVolumeSpecName: "kube-api-access-rz7bx") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "kube-api-access-rz7bx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:57:22.634873 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.634807 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42dac430-abf3-4cd4-bec7-33ed8b1c5ade" (UID: "42dac430-abf3-4cd4-bec7-33ed8b1c5ade"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:57:22.734128 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.734096 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-oauth-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:22.734128 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.734130 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-service-ca\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:22.734351 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.734151 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rz7bx\" (UniqueName: \"kubernetes.io/projected/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-kube-api-access-rz7bx\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:22.734351 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:22.734166 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42dac430-abf3-4cd4-bec7-33ed8b1c5ade-console-serving-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:57:23.248416 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.248383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84b778c6c4-9q27q_42dac430-abf3-4cd4-bec7-33ed8b1c5ade/console/0.log" Apr 17 16:57:23.248577 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.248440 2574 generic.go:358] "Generic (PLEG): container finished" podID="42dac430-abf3-4cd4-bec7-33ed8b1c5ade" containerID="44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52" exitCode=2 Apr 17 16:57:23.248577 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.248491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-84b778c6c4-9q27q" event={"ID":"42dac430-abf3-4cd4-bec7-33ed8b1c5ade","Type":"ContainerDied","Data":"44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52"} Apr 17 16:57:23.248577 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.248529 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84b778c6c4-9q27q" Apr 17 16:57:23.248577 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.248540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b778c6c4-9q27q" event={"ID":"42dac430-abf3-4cd4-bec7-33ed8b1c5ade","Type":"ContainerDied","Data":"963d7d1f722ac820dea4a3aae6f7bb73c24558d36fad7cd521438ec04c4931d2"} Apr 17 16:57:23.248577 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.248561 2574 scope.go:117] "RemoveContainer" containerID="44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52" Apr 17 16:57:23.267176 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.267153 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84b778c6c4-9q27q"] Apr 17 16:57:23.270311 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.270289 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84b778c6c4-9q27q"] Apr 17 16:57:23.868166 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.868143 2574 scope.go:117] "RemoveContainer" containerID="44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52" Apr 17 16:57:23.868514 ip-10-0-138-47 kubenswrapper[2574]: E0417 16:57:23.868494 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52\": container with ID starting with 44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52 not found: ID does not exist" containerID="44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52" Apr 17 
16:57:23.868558 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:23.868527 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52"} err="failed to get container status \"44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52\": rpc error: code = NotFound desc = could not find container \"44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52\": container with ID starting with 44a7cc21953ecab734f30b0b6d574707e4b34f6b06e4b821fa18aa5b67700d52 not found: ID does not exist" Apr 17 16:57:24.253094 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:24.253000 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vdlnl" event={"ID":"7b872dd8-6b64-4209-aecb-57e4459eea02","Type":"ContainerStarted","Data":"9af80d4e34614083e0bcdc9e3edae6e4b3db7c2c70e45d96c1008630657d142c"} Apr 17 16:57:24.268279 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:24.268235 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vdlnl" podStartSLOduration=0.807254925 podStartE2EDuration="4.268222347s" podCreationTimestamp="2026-04-17 16:57:20 +0000 UTC" firstStartedPulling="2026-04-17 16:57:20.460739913 +0000 UTC m=+279.860168426" lastFinishedPulling="2026-04-17 16:57:23.921707338 +0000 UTC m=+283.321135848" observedRunningTime="2026-04-17 16:57:24.26705177 +0000 UTC m=+283.666480302" watchObservedRunningTime="2026-04-17 16:57:24.268222347 +0000 UTC m=+283.667650878" Apr 17 16:57:25.166643 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:25.166607 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42dac430-abf3-4cd4-bec7-33ed8b1c5ade" path="/var/lib/kubelet/pods/42dac430-abf3-4cd4-bec7-33ed8b1c5ade/volumes" Apr 17 16:57:41.036904 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:41.036873 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 16:57:41.037722 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:41.037702 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 16:57:41.046570 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:41.046552 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:57:44.229147 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.229112 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96"] Apr 17 16:57:44.231489 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.229452 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42dac430-abf3-4cd4-bec7-33ed8b1c5ade" containerName="console" Apr 17 16:57:44.231489 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.229465 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="42dac430-abf3-4cd4-bec7-33ed8b1c5ade" containerName="console" Apr 17 16:57:44.231489 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.229519 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="42dac430-abf3-4cd4-bec7-33ed8b1c5ade" containerName="console" Apr 17 16:57:44.232407 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.232391 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.235165 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.235144 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:57:44.236315 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.236300 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:57:44.236368 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.236323 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-wj58v\"" Apr 17 16:57:44.243315 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.243295 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96"] Apr 17 16:57:44.303762 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.303727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.303922 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.303770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.303922 
ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.303795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94ts5\" (UniqueName: \"kubernetes.io/projected/009391b8-fb81-4d6b-b154-f91b2ee443d7-kube-api-access-94ts5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.405155 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.405124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.405314 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.405173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.405314 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.405198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94ts5\" (UniqueName: \"kubernetes.io/projected/009391b8-fb81-4d6b-b154-f91b2ee443d7-kube-api-access-94ts5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.405515 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:57:44.405495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.405562 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.405518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.414134 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.414109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94ts5\" (UniqueName: \"kubernetes.io/projected/009391b8-fb81-4d6b-b154-f91b2ee443d7-kube-api-access-94ts5\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.541126 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.541096 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:57:44.662323 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.662130 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96"] Apr 17 16:57:44.664314 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:57:44.664285 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod009391b8_fb81_4d6b_b154_f91b2ee443d7.slice/crio-e7430d0b79a9eee8d6911af1a40c9e7b334269899cef8fd608afde43a5782f82 WatchSource:0}: Error finding container e7430d0b79a9eee8d6911af1a40c9e7b334269899cef8fd608afde43a5782f82: Status 404 returned error can't find the container with id e7430d0b79a9eee8d6911af1a40c9e7b334269899cef8fd608afde43a5782f82 Apr 17 16:57:44.666248 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:44.666234 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:57:45.317870 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:45.317821 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" event={"ID":"009391b8-fb81-4d6b-b154-f91b2ee443d7","Type":"ContainerStarted","Data":"e7430d0b79a9eee8d6911af1a40c9e7b334269899cef8fd608afde43a5782f82"} Apr 17 16:57:50.335041 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:50.335004 2574 generic.go:358] "Generic (PLEG): container finished" podID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerID="9e0b9aaacb65e5e57cef92a8ecdd62a7a4b6e1bda5c8e7344ed84b1e071354bd" exitCode=0 Apr 17 16:57:50.335509 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:50.335093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" 
event={"ID":"009391b8-fb81-4d6b-b154-f91b2ee443d7","Type":"ContainerDied","Data":"9e0b9aaacb65e5e57cef92a8ecdd62a7a4b6e1bda5c8e7344ed84b1e071354bd"} Apr 17 16:57:53.345729 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:53.345695 2574 generic.go:358] "Generic (PLEG): container finished" podID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerID="2c42a57daae06a552988d9a25f34bb267d02889ebac7f6e4a2b91cfe20f4a0d7" exitCode=0 Apr 17 16:57:53.346121 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:57:53.345782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" event={"ID":"009391b8-fb81-4d6b-b154-f91b2ee443d7","Type":"ContainerDied","Data":"2c42a57daae06a552988d9a25f34bb267d02889ebac7f6e4a2b91cfe20f4a0d7"} Apr 17 16:58:01.372010 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:01.371973 2574 generic.go:358] "Generic (PLEG): container finished" podID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerID="cff036eea0f68151b453f58067572b2267bdd3b14cbd93889bf6bfe043e73022" exitCode=0 Apr 17 16:58:01.372440 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:01.372060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" event={"ID":"009391b8-fb81-4d6b-b154-f91b2ee443d7","Type":"ContainerDied","Data":"cff036eea0f68151b453f58067572b2267bdd3b14cbd93889bf6bfe043e73022"} Apr 17 16:58:02.493188 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.493164 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:58:02.568764 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.568733 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-bundle\") pod \"009391b8-fb81-4d6b-b154-f91b2ee443d7\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " Apr 17 16:58:02.568764 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.568767 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-util\") pod \"009391b8-fb81-4d6b-b154-f91b2ee443d7\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " Apr 17 16:58:02.568987 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.568800 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94ts5\" (UniqueName: \"kubernetes.io/projected/009391b8-fb81-4d6b-b154-f91b2ee443d7-kube-api-access-94ts5\") pod \"009391b8-fb81-4d6b-b154-f91b2ee443d7\" (UID: \"009391b8-fb81-4d6b-b154-f91b2ee443d7\") " Apr 17 16:58:02.569346 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.569320 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-bundle" (OuterVolumeSpecName: "bundle") pod "009391b8-fb81-4d6b-b154-f91b2ee443d7" (UID: "009391b8-fb81-4d6b-b154-f91b2ee443d7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:02.571097 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.571073 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009391b8-fb81-4d6b-b154-f91b2ee443d7-kube-api-access-94ts5" (OuterVolumeSpecName: "kube-api-access-94ts5") pod "009391b8-fb81-4d6b-b154-f91b2ee443d7" (UID: "009391b8-fb81-4d6b-b154-f91b2ee443d7"). InnerVolumeSpecName "kube-api-access-94ts5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:02.572574 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.572556 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-util" (OuterVolumeSpecName: "util") pod "009391b8-fb81-4d6b-b154-f91b2ee443d7" (UID: "009391b8-fb81-4d6b-b154-f91b2ee443d7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:02.670454 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.670369 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:58:02.670454 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.670399 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/009391b8-fb81-4d6b-b154-f91b2ee443d7-util\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:58:02.670454 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:02.670409 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94ts5\" (UniqueName: \"kubernetes.io/projected/009391b8-fb81-4d6b-b154-f91b2ee443d7-kube-api-access-94ts5\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 16:58:03.379771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:03.379734 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" event={"ID":"009391b8-fb81-4d6b-b154-f91b2ee443d7","Type":"ContainerDied","Data":"e7430d0b79a9eee8d6911af1a40c9e7b334269899cef8fd608afde43a5782f82"} Apr 17 16:58:03.379771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:03.379767 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7430d0b79a9eee8d6911af1a40c9e7b334269899cef8fd608afde43a5782f82" Apr 17 16:58:03.379771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:03.379767 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5p2h96" Apr 17 16:58:13.740441 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740408 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2479v"] Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740720 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerName="pull" Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740733 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerName="pull" Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740745 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerName="extract" Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740751 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerName="extract" Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740762 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" 
containerName="util" Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740767 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerName="util" Apr 17 16:58:13.740921 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.740812 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="009391b8-fb81-4d6b-b154-f91b2ee443d7" containerName="extract" Apr 17 16:58:13.745962 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.745945 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:13.748563 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.748540 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 16:58:13.749687 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.749646 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 16:58:13.749769 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.749647 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-vdwnw\"" Apr 17 16:58:13.754142 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.754120 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2479v"] Apr 17 16:58:13.861870 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.861838 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szn2\" (UniqueName: \"kubernetes.io/projected/5ae0eb97-8439-426c-a6e0-6ab39f54aca4-kube-api-access-7szn2\") pod \"cert-manager-webhook-597b96b99b-2479v\" (UID: \"5ae0eb97-8439-426c-a6e0-6ab39f54aca4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:13.862042 ip-10-0-138-47 kubenswrapper[2574]: I0417 
16:58:13.861893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ae0eb97-8439-426c-a6e0-6ab39f54aca4-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2479v\" (UID: \"5ae0eb97-8439-426c-a6e0-6ab39f54aca4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:13.963193 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.963157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ae0eb97-8439-426c-a6e0-6ab39f54aca4-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2479v\" (UID: \"5ae0eb97-8439-426c-a6e0-6ab39f54aca4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:13.963339 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.963266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7szn2\" (UniqueName: \"kubernetes.io/projected/5ae0eb97-8439-426c-a6e0-6ab39f54aca4-kube-api-access-7szn2\") pod \"cert-manager-webhook-597b96b99b-2479v\" (UID: \"5ae0eb97-8439-426c-a6e0-6ab39f54aca4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:13.972036 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.972015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5ae0eb97-8439-426c-a6e0-6ab39f54aca4-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-2479v\" (UID: \"5ae0eb97-8439-426c-a6e0-6ab39f54aca4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:13.972202 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:13.972184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szn2\" (UniqueName: \"kubernetes.io/projected/5ae0eb97-8439-426c-a6e0-6ab39f54aca4-kube-api-access-7szn2\") pod \"cert-manager-webhook-597b96b99b-2479v\" 
(UID: \"5ae0eb97-8439-426c-a6e0-6ab39f54aca4\") " pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:14.066478 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:14.066447 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:14.196956 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:14.196933 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-2479v"] Apr 17 16:58:14.199571 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:58:14.199542 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ae0eb97_8439_426c_a6e0_6ab39f54aca4.slice/crio-8fde1fafbcfe8f8c538a6aad13cff65a07e4059cadaa57032084b2c667219528 WatchSource:0}: Error finding container 8fde1fafbcfe8f8c538a6aad13cff65a07e4059cadaa57032084b2c667219528: Status 404 returned error can't find the container with id 8fde1fafbcfe8f8c538a6aad13cff65a07e4059cadaa57032084b2c667219528 Apr 17 16:58:14.412023 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:14.411937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" event={"ID":"5ae0eb97-8439-426c-a6e0-6ab39f54aca4","Type":"ContainerStarted","Data":"8fde1fafbcfe8f8c538a6aad13cff65a07e4059cadaa57032084b2c667219528"} Apr 17 16:58:17.427292 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:17.427212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" event={"ID":"5ae0eb97-8439-426c-a6e0-6ab39f54aca4","Type":"ContainerStarted","Data":"45b2581938c90ec74e9f1b622e6916f6dd0777bbc1221949e7a089cdb5c93980"} Apr 17 16:58:17.427702 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:17.427327 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 
16:58:17.445191 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:17.445144 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" podStartSLOduration=1.5419715219999999 podStartE2EDuration="4.44513116s" podCreationTimestamp="2026-04-17 16:58:13 +0000 UTC" firstStartedPulling="2026-04-17 16:58:14.201353197 +0000 UTC m=+333.600781706" lastFinishedPulling="2026-04-17 16:58:17.104512831 +0000 UTC m=+336.503941344" observedRunningTime="2026-04-17 16:58:17.44291778 +0000 UTC m=+336.842346314" watchObservedRunningTime="2026-04-17 16:58:17.44513116 +0000 UTC m=+336.844559692" Apr 17 16:58:23.432491 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:23.432464 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-2479v" Apr 17 16:58:35.771997 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.771899 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md"] Apr 17 16:58:35.775302 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.775280 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:35.777892 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.777864 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:58:35.777996 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.777893 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 16:58:35.778916 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.778896 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-g9426\"" Apr 17 16:58:35.782084 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.781895 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md"] Apr 17 16:58:35.825149 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.825107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/62f3ffce-643f-44b0-bdc2-9782e465be26-tmp\") pod \"openshift-lws-operator-bfc7f696d-vs8md\" (UID: \"62f3ffce-643f-44b0-bdc2-9782e465be26\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:35.825373 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.825157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv4mw\" (UniqueName: \"kubernetes.io/projected/62f3ffce-643f-44b0-bdc2-9782e465be26-kube-api-access-mv4mw\") pod \"openshift-lws-operator-bfc7f696d-vs8md\" (UID: \"62f3ffce-643f-44b0-bdc2-9782e465be26\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:35.926515 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.926477 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/62f3ffce-643f-44b0-bdc2-9782e465be26-tmp\") pod \"openshift-lws-operator-bfc7f696d-vs8md\" (UID: \"62f3ffce-643f-44b0-bdc2-9782e465be26\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:35.926709 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.926526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv4mw\" (UniqueName: \"kubernetes.io/projected/62f3ffce-643f-44b0-bdc2-9782e465be26-kube-api-access-mv4mw\") pod \"openshift-lws-operator-bfc7f696d-vs8md\" (UID: \"62f3ffce-643f-44b0-bdc2-9782e465be26\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:35.926902 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.926882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/62f3ffce-643f-44b0-bdc2-9782e465be26-tmp\") pod \"openshift-lws-operator-bfc7f696d-vs8md\" (UID: \"62f3ffce-643f-44b0-bdc2-9782e465be26\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:35.935468 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:35.935435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv4mw\" (UniqueName: \"kubernetes.io/projected/62f3ffce-643f-44b0-bdc2-9782e465be26-kube-api-access-mv4mw\") pod \"openshift-lws-operator-bfc7f696d-vs8md\" (UID: \"62f3ffce-643f-44b0-bdc2-9782e465be26\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:36.084777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:36.084688 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" Apr 17 16:58:36.207358 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:36.207313 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md"] Apr 17 16:58:36.209545 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:58:36.209514 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62f3ffce_643f_44b0_bdc2_9782e465be26.slice/crio-a08256250325f2c16777111c768747a900445fd52417092d5e2469b22e1d3cf4 WatchSource:0}: Error finding container a08256250325f2c16777111c768747a900445fd52417092d5e2469b22e1d3cf4: Status 404 returned error can't find the container with id a08256250325f2c16777111c768747a900445fd52417092d5e2469b22e1d3cf4 Apr 17 16:58:36.488777 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:36.488688 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" event={"ID":"62f3ffce-643f-44b0-bdc2-9782e465be26","Type":"ContainerStarted","Data":"a08256250325f2c16777111c768747a900445fd52417092d5e2469b22e1d3cf4"} Apr 17 16:58:39.499524 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:39.499478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" event={"ID":"62f3ffce-643f-44b0-bdc2-9782e465be26","Type":"ContainerStarted","Data":"95b4b982e7f6e3461235d68c25fbcc2189bf403617a4b73430137ed535f02066"} Apr 17 16:58:39.515626 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:39.515579 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-vs8md" podStartSLOduration=1.96783618 podStartE2EDuration="4.515564377s" podCreationTimestamp="2026-04-17 16:58:35 +0000 UTC" firstStartedPulling="2026-04-17 16:58:36.211071036 +0000 UTC m=+355.610499549" 
lastFinishedPulling="2026-04-17 16:58:38.758799224 +0000 UTC m=+358.158227746" observedRunningTime="2026-04-17 16:58:39.513466384 +0000 UTC m=+358.912894915" watchObservedRunningTime="2026-04-17 16:58:39.515564377 +0000 UTC m=+358.914992913" Apr 17 16:58:46.625933 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.625897 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs"] Apr 17 16:58:46.629635 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.629613 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.633124 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.633094 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 16:58:46.633124 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.633112 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lb5t5\"" Apr 17 16:58:46.633286 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.633203 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 16:58:46.633286 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.633221 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 16:58:46.640114 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.640094 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs"] Apr 17 16:58:46.709149 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.709113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz4st\" (UniqueName: 
\"kubernetes.io/projected/95433ad1-fccf-49b7-822b-2001e954db45-kube-api-access-dz4st\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.709354 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.709169 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95433ad1-fccf-49b7-822b-2001e954db45-cert\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.709354 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.709253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95433ad1-fccf-49b7-822b-2001e954db45-manager-config\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.709354 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.709281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95433ad1-fccf-49b7-822b-2001e954db45-metrics-cert\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.810046 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.810005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95433ad1-fccf-49b7-822b-2001e954db45-manager-config\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " 
pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.810046 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.810047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95433ad1-fccf-49b7-822b-2001e954db45-metrics-cert\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.810242 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.810088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz4st\" (UniqueName: \"kubernetes.io/projected/95433ad1-fccf-49b7-822b-2001e954db45-kube-api-access-dz4st\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.810242 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.810116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95433ad1-fccf-49b7-822b-2001e954db45-cert\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.810718 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.810690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/95433ad1-fccf-49b7-822b-2001e954db45-manager-config\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.812804 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.812771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/95433ad1-fccf-49b7-822b-2001e954db45-cert\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.812894 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.812808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95433ad1-fccf-49b7-822b-2001e954db45-metrics-cert\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.831917 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.831897 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz4st\" (UniqueName: \"kubernetes.io/projected/95433ad1-fccf-49b7-822b-2001e954db45-kube-api-access-dz4st\") pod \"lws-controller-manager-58fcc7cb5-pqvjs\" (UID: \"95433ad1-fccf-49b7-822b-2001e954db45\") " pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:46.939095 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:46.939001 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:47.066389 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:47.066362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs"] Apr 17 16:58:47.069564 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:58:47.069535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95433ad1_fccf_49b7_822b_2001e954db45.slice/crio-160c5b3d1e81a25d716d81ef43942f461399e14952c4badf013096e8b06c67e0 WatchSource:0}: Error finding container 160c5b3d1e81a25d716d81ef43942f461399e14952c4badf013096e8b06c67e0: Status 404 returned error can't find the container with id 160c5b3d1e81a25d716d81ef43942f461399e14952c4badf013096e8b06c67e0 Apr 17 16:58:47.529593 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:47.529553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" event={"ID":"95433ad1-fccf-49b7-822b-2001e954db45","Type":"ContainerStarted","Data":"160c5b3d1e81a25d716d81ef43942f461399e14952c4badf013096e8b06c67e0"} Apr 17 16:58:49.537487 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:49.537452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" event={"ID":"95433ad1-fccf-49b7-822b-2001e954db45","Type":"ContainerStarted","Data":"9f1c08af00ca22dde3d762aa4beb8343d1cc48107bb0a0c1f7705de9490fc831"} Apr 17 16:58:49.537988 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:49.537565 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:58:49.553540 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:49.553465 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" podStartSLOduration=1.5986961119999998 podStartE2EDuration="3.553452206s" podCreationTimestamp="2026-04-17 16:58:46 +0000 UTC" firstStartedPulling="2026-04-17 16:58:47.071488277 +0000 UTC m=+366.470916789" lastFinishedPulling="2026-04-17 16:58:49.026244358 +0000 UTC m=+368.425672883" observedRunningTime="2026-04-17 16:58:49.55294247 +0000 UTC m=+368.952371003" watchObservedRunningTime="2026-04-17 16:58:49.553452206 +0000 UTC m=+368.952880737" Apr 17 16:58:54.273100 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.273067 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc"] Apr 17 16:58:54.275680 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.275628 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.278056 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.278036 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 16:58:54.278570 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.278554 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-28vvs\"" Apr 17 16:58:54.278756 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.278743 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 16:58:54.278851 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.278835 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 16:58:54.284784 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.284761 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 16:58:54.290484 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.290464 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc"] Apr 17 16:58:54.374956 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.374914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/48fa93bb-c551-41d6-b51c-48796a50bab3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.375125 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.375035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/48fa93bb-c551-41d6-b51c-48796a50bab3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.375125 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.375085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpx4n\" (UniqueName: \"kubernetes.io/projected/48fa93bb-c551-41d6-b51c-48796a50bab3-kube-api-access-zpx4n\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.476019 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.475978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/48fa93bb-c551-41d6-b51c-48796a50bab3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.476176 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.476036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpx4n\" (UniqueName: \"kubernetes.io/projected/48fa93bb-c551-41d6-b51c-48796a50bab3-kube-api-access-zpx4n\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.476176 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.476057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/48fa93bb-c551-41d6-b51c-48796a50bab3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.478785 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.478761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/48fa93bb-c551-41d6-b51c-48796a50bab3-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.478888 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.478808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/48fa93bb-c551-41d6-b51c-48796a50bab3-webhook-cert\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: 
\"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.488030 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.488002 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpx4n\" (UniqueName: \"kubernetes.io/projected/48fa93bb-c551-41d6-b51c-48796a50bab3-kube-api-access-zpx4n\") pod \"opendatahub-operator-controller-manager-6b98d9f7df-fcdvc\" (UID: \"48fa93bb-c551-41d6-b51c-48796a50bab3\") " pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.585730 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.585632 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:54.712878 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:54.712844 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc"] Apr 17 16:58:54.714894 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:58:54.714870 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fa93bb_c551_41d6_b51c_48796a50bab3.slice/crio-d8c873723230e0401a87e496205bca30fb6aa5ccb29405d8330c1ca9ab6ef2f3 WatchSource:0}: Error finding container d8c873723230e0401a87e496205bca30fb6aa5ccb29405d8330c1ca9ab6ef2f3: Status 404 returned error can't find the container with id d8c873723230e0401a87e496205bca30fb6aa5ccb29405d8330c1ca9ab6ef2f3 Apr 17 16:58:55.562045 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:55.561988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" event={"ID":"48fa93bb-c551-41d6-b51c-48796a50bab3","Type":"ContainerStarted","Data":"d8c873723230e0401a87e496205bca30fb6aa5ccb29405d8330c1ca9ab6ef2f3"} Apr 17 16:58:57.570667 ip-10-0-138-47 
kubenswrapper[2574]: I0417 16:58:57.570610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" event={"ID":"48fa93bb-c551-41d6-b51c-48796a50bab3","Type":"ContainerStarted","Data":"6fd5e27298221a5e89423455e079bb51692419e81c5081d1f3c49a985d3d04e4"} Apr 17 16:58:57.571082 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:57.570758 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:58:57.596678 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:58:57.596591 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" podStartSLOduration=1.207181015 podStartE2EDuration="3.596576591s" podCreationTimestamp="2026-04-17 16:58:54 +0000 UTC" firstStartedPulling="2026-04-17 16:58:54.716535822 +0000 UTC m=+374.115964333" lastFinishedPulling="2026-04-17 16:58:57.105931396 +0000 UTC m=+376.505359909" observedRunningTime="2026-04-17 16:58:57.595535428 +0000 UTC m=+376.994963963" watchObservedRunningTime="2026-04-17 16:58:57.596576591 +0000 UTC m=+376.996005117" Apr 17 16:59:00.544034 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:00.543995 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-58fcc7cb5-pqvjs" Apr 17 16:59:08.575808 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:08.575776 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6b98d9f7df-fcdvc" Apr 17 16:59:52.113543 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.113507 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm"] Apr 17 16:59:52.123235 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.123206 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.125770 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.125741 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-jqc2c\"" Apr 17 16:59:52.125910 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.125745 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 16:59:52.126363 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.126328 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm"] Apr 17 16:59:52.163834 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.163803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.163960 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.163850 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.163960 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.163867 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.163960 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.163930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.164066 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.163986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj297\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-kube-api-access-gj297\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.164066 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.164009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.164128 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.164060 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.164128 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.164092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.164128 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.164114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265046 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265046 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265051 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265291 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265291 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265382 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265609 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265582 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.265771 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.265513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.266171 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.266077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj297\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-kube-api-access-gj297\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.266171 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.266094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.266171 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.266126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.266774 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.266741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.267992 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.267969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.268615 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.268597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.274126 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.274099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.274228 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.274199 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj297\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-kube-api-access-gj297\") pod 
\"data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.434928 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.434836 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:52.562715 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.562626 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm"] Apr 17 16:59:52.565079 ip-10-0-138-47 kubenswrapper[2574]: W0417 16:59:52.565045 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa28ecf5_e670_42d1_93a7_b596f8d23bbe.slice/crio-f3a7e17b928f62c7d4ae1f55f085d753cc011f2ae67671a24e5728c5789f74a7 WatchSource:0}: Error finding container f3a7e17b928f62c7d4ae1f55f085d753cc011f2ae67671a24e5728c5789f74a7: Status 404 returned error can't find the container with id f3a7e17b928f62c7d4ae1f55f085d753cc011f2ae67671a24e5728c5789f74a7 Apr 17 16:59:52.751965 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:52.751874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" event={"ID":"aa28ecf5-e670-42d1-93a7-b596f8d23bbe","Type":"ContainerStarted","Data":"f3a7e17b928f62c7d4ae1f55f085d753cc011f2ae67671a24e5728c5789f74a7"} Apr 17 16:59:54.792215 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:54.792178 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 17 16:59:54.792502 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:54.792252 2574 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 17 16:59:54.792502 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:54.792293 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 17 16:59:55.762693 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:55.762636 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" event={"ID":"aa28ecf5-e670-42d1-93a7-b596f8d23bbe","Type":"ContainerStarted","Data":"9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31"} Apr 17 16:59:55.782828 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:55.782776 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" podStartSLOduration=1.558243326 podStartE2EDuration="3.782760932s" podCreationTimestamp="2026-04-17 16:59:52 +0000 UTC" firstStartedPulling="2026-04-17 16:59:52.567434504 +0000 UTC m=+431.966863027" lastFinishedPulling="2026-04-17 16:59:54.791952118 +0000 UTC m=+434.191380633" observedRunningTime="2026-04-17 16:59:55.781452931 +0000 UTC m=+435.180881464" watchObservedRunningTime="2026-04-17 16:59:55.782760932 +0000 UTC m=+435.182189533" Apr 17 16:59:56.435538 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:56.435500 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 16:59:56.436936 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:56.436904 2574 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get 
\"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused" start-of-body= Apr 17 16:59:56.437058 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:56.436959 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused" Apr 17 16:59:57.436036 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:57.436002 2574 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused" start-of-body= Apr 17 16:59:57.436540 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:57.436054 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused" Apr 17 16:59:57.994587 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:57.994554 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm"] Apr 17 16:59:58.435954 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:58.435921 2574 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused" start-of-body= Apr 
17 16:59:58.436127 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:58.435970 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.29:15021/healthz/ready\": dial tcp 10.134.0.29:15021: connect: connection refused" Apr 17 16:59:58.775626 ip-10-0-138-47 kubenswrapper[2574]: I0417 16:59:58.775562 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" containerID="cri-o://9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31" gracePeriod=30 Apr 17 17:00:04.010704 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.010678 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 17:00:04.067829 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.067744 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-envoy\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.067980 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.067833 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-credential-socket\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.067980 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.067870 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-certs\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.067980 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.067945 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-data\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.068139 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.067991 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istiod-ca-cert\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.068139 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068032 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-token\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.068139 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068077 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-podinfo\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.068139 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068131 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj297\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-kube-api-access-gj297\") pod 
\"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.068335 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068166 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-socket\") pod \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\" (UID: \"aa28ecf5-e670-42d1-93a7-b596f8d23bbe\") " Apr 17 17:00:04.068419 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068159 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:04.068419 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068166 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:04.068572 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068436 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "workload-socket". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:04.068572 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068449 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "istiod-ca-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:00:04.068724 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.068564 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-data" (OuterVolumeSpecName: "istio-data") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:04.070355 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.070333 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:04.070450 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.070384 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-kube-api-access-gj297" (OuterVolumeSpecName: "kube-api-access-gj297") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "kube-api-access-gj297". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:04.070495 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.070477 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-token" (OuterVolumeSpecName: "istio-token") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:04.070531 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.070493 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "aa28ecf5-e670-42d1-93a7-b596f8d23bbe" (UID: "aa28ecf5-e670-42d1-93a7-b596f8d23bbe"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 17 17:00:04.169670 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169635 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gj297\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-kube-api-access-gj297\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169687 2574 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-socket\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169698 2574 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-envoy\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169706 2574 
reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-credential-socket\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169714 2574 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-workload-certs\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169723 2574 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-data\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169731 2574 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istiod-ca-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169740 2574 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-token\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.169837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.169747 2574 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/aa28ecf5-e670-42d1-93a7-b596f8d23bbe-istio-podinfo\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:04.796818 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.796778 2574 generic.go:358] "Generic (PLEG): container finished" podID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" 
containerID="9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31" exitCode=0 Apr 17 17:00:04.796985 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.796852 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" Apr 17 17:00:04.796985 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.796851 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" event={"ID":"aa28ecf5-e670-42d1-93a7-b596f8d23bbe","Type":"ContainerDied","Data":"9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31"} Apr 17 17:00:04.796985 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.796963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm" event={"ID":"aa28ecf5-e670-42d1-93a7-b596f8d23bbe","Type":"ContainerDied","Data":"f3a7e17b928f62c7d4ae1f55f085d753cc011f2ae67671a24e5728c5789f74a7"} Apr 17 17:00:04.796985 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.796982 2574 scope.go:117] "RemoveContainer" containerID="9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31" Apr 17 17:00:04.805344 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.805327 2574 scope.go:117] "RemoveContainer" containerID="9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31" Apr 17 17:00:04.805608 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:00:04.805590 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31\": container with ID starting with 9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31 not found: ID does not exist" containerID="9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31" Apr 17 17:00:04.805678 
ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.805617 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31"} err="failed to get container status \"9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31\": rpc error: code = NotFound desc = could not find container \"9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31\": container with ID starting with 9ce27cf2984b8e539833b5201bd7ba98395817d65b32d5ad748c9d7d3c1bdd31 not found: ID does not exist" Apr 17 17:00:04.823081 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.820627 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm"] Apr 17 17:00:04.824893 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:04.824871 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd57snmm"] Apr 17 17:00:05.167124 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:05.167043 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" path="/var/lib/kubelet/pods/aa28ecf5-e670-42d1-93a7-b596f8d23bbe/volumes" Apr 17 17:00:12.183482 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.183448 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nrsrz"] Apr 17 17:00:12.183986 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.183766 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" Apr 17 17:00:12.183986 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.183777 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" Apr 17 17:00:12.183986 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:00:12.183840 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa28ecf5-e670-42d1-93a7-b596f8d23bbe" containerName="istio-proxy" Apr 17 17:00:12.188068 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.188051 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:12.191800 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.191762 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:00:12.191800 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.191777 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-lmw65\"" Apr 17 17:00:12.192035 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.191763 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:00:12.194088 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.194065 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nrsrz"] Apr 17 17:00:12.343287 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.343254 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zvfx\" (UniqueName: \"kubernetes.io/projected/aaa2808d-bcb1-47d6-9c38-d10ad4097766-kube-api-access-2zvfx\") pod \"kuadrant-operator-catalog-nrsrz\" (UID: \"aaa2808d-bcb1-47d6-9c38-d10ad4097766\") " pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:12.444204 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.444111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zvfx\" (UniqueName: \"kubernetes.io/projected/aaa2808d-bcb1-47d6-9c38-d10ad4097766-kube-api-access-2zvfx\") pod \"kuadrant-operator-catalog-nrsrz\" (UID: 
\"aaa2808d-bcb1-47d6-9c38-d10ad4097766\") " pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:12.452229 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.452199 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zvfx\" (UniqueName: \"kubernetes.io/projected/aaa2808d-bcb1-47d6-9c38-d10ad4097766-kube-api-access-2zvfx\") pod \"kuadrant-operator-catalog-nrsrz\" (UID: \"aaa2808d-bcb1-47d6-9c38-d10ad4097766\") " pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:12.498343 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.498309 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:12.555221 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.555158 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nrsrz"] Apr 17 17:00:12.620363 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.620341 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nrsrz"] Apr 17 17:00:12.623048 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:12.623022 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaa2808d_bcb1_47d6_9c38_d10ad4097766.slice/crio-7850510d93dc6074f9d7acd98854168a4ed4b164f9b0db938305bfb364c94150 WatchSource:0}: Error finding container 7850510d93dc6074f9d7acd98854168a4ed4b164f9b0db938305bfb364c94150: Status 404 returned error can't find the container with id 7850510d93dc6074f9d7acd98854168a4ed4b164f9b0db938305bfb364c94150 Apr 17 17:00:12.763314 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.763231 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t295l"] Apr 17 17:00:12.767830 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.767815 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:12.773527 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.773495 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t295l"] Apr 17 17:00:12.826215 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.826183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" event={"ID":"aaa2808d-bcb1-47d6-9c38-d10ad4097766","Type":"ContainerStarted","Data":"7850510d93dc6074f9d7acd98854168a4ed4b164f9b0db938305bfb364c94150"} Apr 17 17:00:12.847595 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.847570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxggq\" (UniqueName: \"kubernetes.io/projected/f36a0158-13e3-4493-85aa-eae8dc7f088a-kube-api-access-sxggq\") pod \"kuadrant-operator-catalog-t295l\" (UID: \"f36a0158-13e3-4493-85aa-eae8dc7f088a\") " pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:12.948944 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.948904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxggq\" (UniqueName: \"kubernetes.io/projected/f36a0158-13e3-4493-85aa-eae8dc7f088a-kube-api-access-sxggq\") pod \"kuadrant-operator-catalog-t295l\" (UID: \"f36a0158-13e3-4493-85aa-eae8dc7f088a\") " pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:12.956857 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:12.956829 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxggq\" (UniqueName: \"kubernetes.io/projected/f36a0158-13e3-4493-85aa-eae8dc7f088a-kube-api-access-sxggq\") pod \"kuadrant-operator-catalog-t295l\" (UID: \"f36a0158-13e3-4493-85aa-eae8dc7f088a\") " pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:13.078298 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:00:13.078258 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:13.210132 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:13.210085 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t295l"] Apr 17 17:00:13.254220 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:13.254173 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf36a0158_13e3_4493_85aa_eae8dc7f088a.slice/crio-070360b9e512f2accb52461d3b0d94be9499fae5241047f032d8a79a03bdcf5d WatchSource:0}: Error finding container 070360b9e512f2accb52461d3b0d94be9499fae5241047f032d8a79a03bdcf5d: Status 404 returned error can't find the container with id 070360b9e512f2accb52461d3b0d94be9499fae5241047f032d8a79a03bdcf5d Apr 17 17:00:13.830873 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:13.830830 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t295l" event={"ID":"f36a0158-13e3-4493-85aa-eae8dc7f088a","Type":"ContainerStarted","Data":"070360b9e512f2accb52461d3b0d94be9499fae5241047f032d8a79a03bdcf5d"} Apr 17 17:00:15.838294 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:15.838257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" event={"ID":"aaa2808d-bcb1-47d6-9c38-d10ad4097766","Type":"ContainerStarted","Data":"33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795"} Apr 17 17:00:15.838768 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:15.838371 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" podUID="aaa2808d-bcb1-47d6-9c38-d10ad4097766" containerName="registry-server" containerID="cri-o://33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795" gracePeriod=2 Apr 17 17:00:15.839961 
ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:15.839926 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t295l" event={"ID":"f36a0158-13e3-4493-85aa-eae8dc7f088a","Type":"ContainerStarted","Data":"58aacd878c5a5bac7167f7c0438ce5fc92d2357327ffdc7d3ffeb296c46f1333"} Apr 17 17:00:15.858166 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:15.858122 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" podStartSLOduration=1.64466325 podStartE2EDuration="3.858104537s" podCreationTimestamp="2026-04-17 17:00:12 +0000 UTC" firstStartedPulling="2026-04-17 17:00:12.624481621 +0000 UTC m=+452.023910142" lastFinishedPulling="2026-04-17 17:00:14.837922919 +0000 UTC m=+454.237351429" observedRunningTime="2026-04-17 17:00:15.856913396 +0000 UTC m=+455.256341928" watchObservedRunningTime="2026-04-17 17:00:15.858104537 +0000 UTC m=+455.257533066" Apr 17 17:00:15.875188 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:15.875141 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-t295l" podStartSLOduration=2.289722934 podStartE2EDuration="3.875127865s" podCreationTimestamp="2026-04-17 17:00:12 +0000 UTC" firstStartedPulling="2026-04-17 17:00:13.255717841 +0000 UTC m=+452.655146355" lastFinishedPulling="2026-04-17 17:00:14.841122776 +0000 UTC m=+454.240551286" observedRunningTime="2026-04-17 17:00:15.873336073 +0000 UTC m=+455.272764616" watchObservedRunningTime="2026-04-17 17:00:15.875127865 +0000 UTC m=+455.274556396" Apr 17 17:00:16.073032 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.073009 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:16.179577 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.179494 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zvfx\" (UniqueName: \"kubernetes.io/projected/aaa2808d-bcb1-47d6-9c38-d10ad4097766-kube-api-access-2zvfx\") pod \"aaa2808d-bcb1-47d6-9c38-d10ad4097766\" (UID: \"aaa2808d-bcb1-47d6-9c38-d10ad4097766\") " Apr 17 17:00:16.181898 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.181866 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa2808d-bcb1-47d6-9c38-d10ad4097766-kube-api-access-2zvfx" (OuterVolumeSpecName: "kube-api-access-2zvfx") pod "aaa2808d-bcb1-47d6-9c38-d10ad4097766" (UID: "aaa2808d-bcb1-47d6-9c38-d10ad4097766"). InnerVolumeSpecName "kube-api-access-2zvfx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:16.280486 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.280451 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zvfx\" (UniqueName: \"kubernetes.io/projected/aaa2808d-bcb1-47d6-9c38-d10ad4097766-kube-api-access-2zvfx\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:16.844748 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.844713 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaa2808d-bcb1-47d6-9c38-d10ad4097766" containerID="33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795" exitCode=0 Apr 17 17:00:16.845167 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.844775 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" Apr 17 17:00:16.845167 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.844803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" event={"ID":"aaa2808d-bcb1-47d6-9c38-d10ad4097766","Type":"ContainerDied","Data":"33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795"} Apr 17 17:00:16.845167 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.844838 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nrsrz" event={"ID":"aaa2808d-bcb1-47d6-9c38-d10ad4097766","Type":"ContainerDied","Data":"7850510d93dc6074f9d7acd98854168a4ed4b164f9b0db938305bfb364c94150"} Apr 17 17:00:16.845167 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.844853 2574 scope.go:117] "RemoveContainer" containerID="33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795" Apr 17 17:00:16.855670 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.855599 2574 scope.go:117] "RemoveContainer" containerID="33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795" Apr 17 17:00:16.856182 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:00:16.856164 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795\": container with ID starting with 33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795 not found: ID does not exist" containerID="33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795" Apr 17 17:00:16.856232 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.856192 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795"} err="failed to get container status \"33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795\": rpc error: 
code = NotFound desc = could not find container \"33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795\": container with ID starting with 33992cb39076533a1c1b72e4dd68272b178aeb44ab40874ca9c9d26fcf69e795 not found: ID does not exist" Apr 17 17:00:16.864849 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.864827 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nrsrz"] Apr 17 17:00:16.868206 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:16.868185 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nrsrz"] Apr 17 17:00:17.167933 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:17.167846 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa2808d-bcb1-47d6-9c38-d10ad4097766" path="/var/lib/kubelet/pods/aaa2808d-bcb1-47d6-9c38-d10ad4097766/volumes" Apr 17 17:00:23.079031 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:23.078975 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:23.079031 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:23.079037 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:23.100846 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:23.100823 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:23.889994 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:23.889967 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-t295l" Apr 17 17:00:27.369911 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.369876 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr"] Apr 17 17:00:27.370276 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:00:27.370200 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaa2808d-bcb1-47d6-9c38-d10ad4097766" containerName="registry-server" Apr 17 17:00:27.370276 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.370210 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa2808d-bcb1-47d6-9c38-d10ad4097766" containerName="registry-server" Apr 17 17:00:27.370276 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.370276 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaa2808d-bcb1-47d6-9c38-d10ad4097766" containerName="registry-server" Apr 17 17:00:27.378315 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.378288 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.380508 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.380483 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr"] Apr 17 17:00:27.380914 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.380892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rt5sd\"" Apr 17 17:00:27.478695 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.478632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.478890 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.478762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brc76\" 
(UniqueName: \"kubernetes.io/projected/bf2b1a44-aabd-490e-8361-7f228a0882ff-kube-api-access-brc76\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.478890 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.478808 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.579503 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.579463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brc76\" (UniqueName: \"kubernetes.io/projected/bf2b1a44-aabd-490e-8361-7f228a0882ff-kube-api-access-brc76\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.579503 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.579508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.579781 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.579590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.579923 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.579899 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.579988 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.579949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.588453 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.588420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brc76\" (UniqueName: \"kubernetes.io/projected/bf2b1a44-aabd-490e-8361-7f228a0882ff-kube-api-access-brc76\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:27.688537 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:27.688441 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:28.017471 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.017280 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr"] Apr 17 17:00:28.020226 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:28.020195 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf2b1a44_aabd_490e_8361_7f228a0882ff.slice/crio-8e3b3da27be454440c073f6734f9ee27172cfafbba972047073442d18075a9e9 WatchSource:0}: Error finding container 8e3b3da27be454440c073f6734f9ee27172cfafbba972047073442d18075a9e9: Status 404 returned error can't find the container with id 8e3b3da27be454440c073f6734f9ee27172cfafbba972047073442d18075a9e9 Apr 17 17:00:28.170395 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.170366 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58"] Apr 17 17:00:28.173759 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.173744 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.180122 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.180095 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58"] Apr 17 17:00:28.184122 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.184098 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.184247 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.184135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phqpg\" (UniqueName: \"kubernetes.io/projected/81e83978-694c-40d6-b6d7-672c2b14b168-kube-api-access-phqpg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.184247 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.184195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.285359 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.285328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-phqpg\" (UniqueName: \"kubernetes.io/projected/81e83978-694c-40d6-b6d7-672c2b14b168-kube-api-access-phqpg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.285536 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.285383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.285536 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.285505 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.285784 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.285765 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.285839 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.285818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.294002 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.293979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phqpg\" (UniqueName: \"kubernetes.io/projected/81e83978-694c-40d6-b6d7-672c2b14b168-kube-api-access-phqpg\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.491955 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.491912 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:28.568904 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.568715 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f"] Apr 17 17:00:28.575672 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.575631 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.579976 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.579947 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f"] Apr 17 17:00:28.587853 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.587820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.587853 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.587866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.588052 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.587935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qrjn\" (UniqueName: \"kubernetes.io/projected/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-kube-api-access-4qrjn\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.620080 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.620057 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58"] Apr 17 17:00:28.622163 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:28.622133 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e83978_694c_40d6_b6d7_672c2b14b168.slice/crio-02b5aacfb50f0fc4256cf2723edaae73cbe0401fbbc6346ee2706b107bd41e61 WatchSource:0}: Error finding container 02b5aacfb50f0fc4256cf2723edaae73cbe0401fbbc6346ee2706b107bd41e61: Status 404 returned error can't find the container with id 02b5aacfb50f0fc4256cf2723edaae73cbe0401fbbc6346ee2706b107bd41e61 Apr 17 17:00:28.688564 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.688542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.688623 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.688573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.688683 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.688631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qrjn\" (UniqueName: \"kubernetes.io/projected/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-kube-api-access-4qrjn\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.694712 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.689181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.694712 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.689321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.697779 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.697758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qrjn\" (UniqueName: \"kubernetes.io/projected/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-kube-api-access-4qrjn\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:28.885082 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.884975 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerID="97cfa7bed4d7f1bbae1993adead42b241c855fbd66c13efdb29c140cf48e6230" exitCode=0 Apr 17 17:00:28.885082 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.885061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" 
event={"ID":"bf2b1a44-aabd-490e-8361-7f228a0882ff","Type":"ContainerDied","Data":"97cfa7bed4d7f1bbae1993adead42b241c855fbd66c13efdb29c140cf48e6230"} Apr 17 17:00:28.885310 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.885098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" event={"ID":"bf2b1a44-aabd-490e-8361-7f228a0882ff","Type":"ContainerStarted","Data":"8e3b3da27be454440c073f6734f9ee27172cfafbba972047073442d18075a9e9"} Apr 17 17:00:28.886557 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.886532 2574 generic.go:358] "Generic (PLEG): container finished" podID="81e83978-694c-40d6-b6d7-672c2b14b168" containerID="bf7f5d2f15c906ae940bf54ad4012ebee2b49e1756b583f23c7367d75ae2da52" exitCode=0 Apr 17 17:00:28.886649 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.886584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" event={"ID":"81e83978-694c-40d6-b6d7-672c2b14b168","Type":"ContainerDied","Data":"bf7f5d2f15c906ae940bf54ad4012ebee2b49e1756b583f23c7367d75ae2da52"} Apr 17 17:00:28.886649 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.886606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" event={"ID":"81e83978-694c-40d6-b6d7-672c2b14b168","Type":"ContainerStarted","Data":"02b5aacfb50f0fc4256cf2723edaae73cbe0401fbbc6346ee2706b107bd41e61"} Apr 17 17:00:28.888604 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:28.888590 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:29.007881 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.007850 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4"] Apr 17 17:00:29.012696 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.012587 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.012696 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.012631 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f"] Apr 17 17:00:29.014721 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:29.014696 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b253b8b_4e0a_4890_a43a_c3f414b85c4f.slice/crio-c66af64d9119c91a89ac9aa176776ef505a800ffc2d50b5dbea0d1dbf6d7c5f7 WatchSource:0}: Error finding container c66af64d9119c91a89ac9aa176776ef505a800ffc2d50b5dbea0d1dbf6d7c5f7: Status 404 returned error can't find the container with id c66af64d9119c91a89ac9aa176776ef505a800ffc2d50b5dbea0d1dbf6d7c5f7 Apr 17 17:00:29.021995 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.021972 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4"] Apr 17 17:00:29.092690 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.092640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.092846 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.092715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.092846 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.092757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk8mg\" (UniqueName: \"kubernetes.io/projected/fa84bb84-5411-451e-82bb-4de3b078eb23-kube-api-access-bk8mg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.194211 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.194101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.194211 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.194160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.194211 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.194196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk8mg\" (UniqueName: \"kubernetes.io/projected/fa84bb84-5411-451e-82bb-4de3b078eb23-kube-api-access-bk8mg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.194577 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.194550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.194627 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.194563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.202485 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.202457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk8mg\" (UniqueName: \"kubernetes.io/projected/fa84bb84-5411-451e-82bb-4de3b078eb23-kube-api-access-bk8mg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 
17:00:29.327225 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.327193 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:29.448912 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.448887 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4"] Apr 17 17:00:29.450856 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:29.450829 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa84bb84_5411_451e_82bb_4de3b078eb23.slice/crio-635d6cf3f583ecffe31fbb71d7b7bd0a8b02cb4b3b8d9358d4f3efe2339f5be5 WatchSource:0}: Error finding container 635d6cf3f583ecffe31fbb71d7b7bd0a8b02cb4b3b8d9358d4f3efe2339f5be5: Status 404 returned error can't find the container with id 635d6cf3f583ecffe31fbb71d7b7bd0a8b02cb4b3b8d9358d4f3efe2339f5be5 Apr 17 17:00:29.893463 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.893430 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerID="ff5d5f6fb5d563fde2c09ae8aeff44b29e4996c35ecf0252f1fea10307e0a03f" exitCode=0 Apr 17 17:00:29.893911 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.893514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" event={"ID":"bf2b1a44-aabd-490e-8361-7f228a0882ff","Type":"ContainerDied","Data":"ff5d5f6fb5d563fde2c09ae8aeff44b29e4996c35ecf0252f1fea10307e0a03f"} Apr 17 17:00:29.895050 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.895026 2574 generic.go:358] "Generic (PLEG): container finished" podID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerID="7ff491de2c0afd5778422192221c7fd39d7e7d09b3f35ff94944baf5ee09d920" exitCode=0 Apr 17 17:00:29.895172 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:00:29.895102 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" event={"ID":"fa84bb84-5411-451e-82bb-4de3b078eb23","Type":"ContainerDied","Data":"7ff491de2c0afd5778422192221c7fd39d7e7d09b3f35ff94944baf5ee09d920"} Apr 17 17:00:29.895172 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.895128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" event={"ID":"fa84bb84-5411-451e-82bb-4de3b078eb23","Type":"ContainerStarted","Data":"635d6cf3f583ecffe31fbb71d7b7bd0a8b02cb4b3b8d9358d4f3efe2339f5be5"} Apr 17 17:00:29.896521 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.896496 2574 generic.go:358] "Generic (PLEG): container finished" podID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerID="d6ac0125d72e04ca51c46648926a6ecea29f27fb6e20bd1e69c33e9b18ab993d" exitCode=0 Apr 17 17:00:29.896605 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.896535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" event={"ID":"9b253b8b-4e0a-4890-a43a-c3f414b85c4f","Type":"ContainerDied","Data":"d6ac0125d72e04ca51c46648926a6ecea29f27fb6e20bd1e69c33e9b18ab993d"} Apr 17 17:00:29.896605 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:29.896570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" event={"ID":"9b253b8b-4e0a-4890-a43a-c3f414b85c4f","Type":"ContainerStarted","Data":"c66af64d9119c91a89ac9aa176776ef505a800ffc2d50b5dbea0d1dbf6d7c5f7"} Apr 17 17:00:30.902816 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.902729 2574 generic.go:358] "Generic (PLEG): container finished" podID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerID="d45e5b12cfb727b14964538279eb4fed3aab3eb38313465fed79bd7796ebd09a" exitCode=0 Apr 17 17:00:30.902816 
ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.902791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" event={"ID":"bf2b1a44-aabd-490e-8361-7f228a0882ff","Type":"ContainerDied","Data":"d45e5b12cfb727b14964538279eb4fed3aab3eb38313465fed79bd7796ebd09a"} Apr 17 17:00:30.904364 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.904335 2574 generic.go:358] "Generic (PLEG): container finished" podID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerID="9a9c857e178ad68f820ebf61e7440190e76780389c673af48d9eaa8ecdb34a3c" exitCode=0 Apr 17 17:00:30.904492 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.904404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" event={"ID":"fa84bb84-5411-451e-82bb-4de3b078eb23","Type":"ContainerDied","Data":"9a9c857e178ad68f820ebf61e7440190e76780389c673af48d9eaa8ecdb34a3c"} Apr 17 17:00:30.906120 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.906103 2574 generic.go:358] "Generic (PLEG): container finished" podID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerID="0ab6600d3adff0a80fe01a5d2cb1eb7894fe2426c1bdbc7fc60c67e2b18ad070" exitCode=0 Apr 17 17:00:30.906198 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.906175 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" event={"ID":"9b253b8b-4e0a-4890-a43a-c3f414b85c4f","Type":"ContainerDied","Data":"0ab6600d3adff0a80fe01a5d2cb1eb7894fe2426c1bdbc7fc60c67e2b18ad070"} Apr 17 17:00:30.907941 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.907917 2574 generic.go:358] "Generic (PLEG): container finished" podID="81e83978-694c-40d6-b6d7-672c2b14b168" containerID="084923a8a5e68c4b42f33581048cb00db0b0faee3baca60fafef257841767692" exitCode=0 Apr 17 17:00:30.908021 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:30.907953 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" event={"ID":"81e83978-694c-40d6-b6d7-672c2b14b168","Type":"ContainerDied","Data":"084923a8a5e68c4b42f33581048cb00db0b0faee3baca60fafef257841767692"} Apr 17 17:00:31.913157 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:31.913120 2574 generic.go:358] "Generic (PLEG): container finished" podID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerID="e18e545144a79806a5472e0863614ec5546247559687271ee775abdb18270341" exitCode=0 Apr 17 17:00:31.913628 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:31.913191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" event={"ID":"fa84bb84-5411-451e-82bb-4de3b078eb23","Type":"ContainerDied","Data":"e18e545144a79806a5472e0863614ec5546247559687271ee775abdb18270341"} Apr 17 17:00:31.914852 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:31.914828 2574 generic.go:358] "Generic (PLEG): container finished" podID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerID="7b6cf5c939a64ddd24fd461ed8c6f3b490910b8e8ae54277e5d6ccb7b6b638f3" exitCode=0 Apr 17 17:00:31.914944 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:31.914877 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" event={"ID":"9b253b8b-4e0a-4890-a43a-c3f414b85c4f","Type":"ContainerDied","Data":"7b6cf5c939a64ddd24fd461ed8c6f3b490910b8e8ae54277e5d6ccb7b6b638f3"} Apr 17 17:00:31.916493 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:31.916470 2574 generic.go:358] "Generic (PLEG): container finished" podID="81e83978-694c-40d6-b6d7-672c2b14b168" containerID="73f23f83c474d48cb2fc218904590463c0971fc07db01f663941f810ac6626db" exitCode=0 Apr 17 17:00:31.916563 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:31.916544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" event={"ID":"81e83978-694c-40d6-b6d7-672c2b14b168","Type":"ContainerDied","Data":"73f23f83c474d48cb2fc218904590463c0971fc07db01f663941f810ac6626db"} Apr 17 17:00:32.045377 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.045356 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:32.116508 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.116469 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-util\") pod \"bf2b1a44-aabd-490e-8361-7f228a0882ff\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " Apr 17 17:00:32.116683 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.116541 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-bundle\") pod \"bf2b1a44-aabd-490e-8361-7f228a0882ff\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " Apr 17 17:00:32.116683 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.116579 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brc76\" (UniqueName: \"kubernetes.io/projected/bf2b1a44-aabd-490e-8361-7f228a0882ff-kube-api-access-brc76\") pod \"bf2b1a44-aabd-490e-8361-7f228a0882ff\" (UID: \"bf2b1a44-aabd-490e-8361-7f228a0882ff\") " Apr 17 17:00:32.117111 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.117078 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-bundle" (OuterVolumeSpecName: "bundle") pod "bf2b1a44-aabd-490e-8361-7f228a0882ff" (UID: "bf2b1a44-aabd-490e-8361-7f228a0882ff"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:32.118962 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.118937 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf2b1a44-aabd-490e-8361-7f228a0882ff-kube-api-access-brc76" (OuterVolumeSpecName: "kube-api-access-brc76") pod "bf2b1a44-aabd-490e-8361-7f228a0882ff" (UID: "bf2b1a44-aabd-490e-8361-7f228a0882ff"). InnerVolumeSpecName "kube-api-access-brc76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:32.123756 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.123730 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-util" (OuterVolumeSpecName: "util") pod "bf2b1a44-aabd-490e-8361-7f228a0882ff" (UID: "bf2b1a44-aabd-490e-8361-7f228a0882ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:32.217242 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.217141 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:32.217242 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.217173 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brc76\" (UniqueName: \"kubernetes.io/projected/bf2b1a44-aabd-490e-8361-7f228a0882ff-kube-api-access-brc76\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:32.217242 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.217185 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bf2b1a44-aabd-490e-8361-7f228a0882ff-util\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:32.923368 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.923334 2574 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" Apr 17 17:00:32.923834 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.923369 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr" event={"ID":"bf2b1a44-aabd-490e-8361-7f228a0882ff","Type":"ContainerDied","Data":"8e3b3da27be454440c073f6734f9ee27172cfafbba972047073442d18075a9e9"} Apr 17 17:00:32.923834 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:32.923408 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e3b3da27be454440c073f6734f9ee27172cfafbba972047073442d18075a9e9" Apr 17 17:00:33.074468 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.074447 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:33.110411 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.110390 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:33.113704 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.113684 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:33.124062 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124043 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qrjn\" (UniqueName: \"kubernetes.io/projected/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-kube-api-access-4qrjn\") pod \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " Apr 17 17:00:33.124159 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124078 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-bundle\") pod \"fa84bb84-5411-451e-82bb-4de3b078eb23\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " Apr 17 17:00:33.124159 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124116 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-util\") pod \"81e83978-694c-40d6-b6d7-672c2b14b168\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " Apr 17 17:00:33.124269 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124177 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phqpg\" (UniqueName: \"kubernetes.io/projected/81e83978-694c-40d6-b6d7-672c2b14b168-kube-api-access-phqpg\") pod \"81e83978-694c-40d6-b6d7-672c2b14b168\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " Apr 17 17:00:33.124269 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124217 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-util\") pod \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " Apr 17 17:00:33.124269 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:00:33.124247 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-bundle\") pod \"81e83978-694c-40d6-b6d7-672c2b14b168\" (UID: \"81e83978-694c-40d6-b6d7-672c2b14b168\") " Apr 17 17:00:33.124417 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124311 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-util\") pod \"fa84bb84-5411-451e-82bb-4de3b078eb23\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " Apr 17 17:00:33.124417 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124340 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk8mg\" (UniqueName: \"kubernetes.io/projected/fa84bb84-5411-451e-82bb-4de3b078eb23-kube-api-access-bk8mg\") pod \"fa84bb84-5411-451e-82bb-4de3b078eb23\" (UID: \"fa84bb84-5411-451e-82bb-4de3b078eb23\") " Apr 17 17:00:33.124417 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124379 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-bundle\") pod \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\" (UID: \"9b253b8b-4e0a-4890-a43a-c3f414b85c4f\") " Apr 17 17:00:33.125204 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.124973 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-bundle" (OuterVolumeSpecName: "bundle") pod "9b253b8b-4e0a-4890-a43a-c3f414b85c4f" (UID: "9b253b8b-4e0a-4890-a43a-c3f414b85c4f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:33.125204 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.125019 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-bundle" (OuterVolumeSpecName: "bundle") pod "81e83978-694c-40d6-b6d7-672c2b14b168" (UID: "81e83978-694c-40d6-b6d7-672c2b14b168"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:33.125204 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.125173 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-bundle" (OuterVolumeSpecName: "bundle") pod "fa84bb84-5411-451e-82bb-4de3b078eb23" (UID: "fa84bb84-5411-451e-82bb-4de3b078eb23"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:33.127321 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.127279 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e83978-694c-40d6-b6d7-672c2b14b168-kube-api-access-phqpg" (OuterVolumeSpecName: "kube-api-access-phqpg") pod "81e83978-694c-40d6-b6d7-672c2b14b168" (UID: "81e83978-694c-40d6-b6d7-672c2b14b168"). InnerVolumeSpecName "kube-api-access-phqpg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:33.127639 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.127614 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-kube-api-access-4qrjn" (OuterVolumeSpecName: "kube-api-access-4qrjn") pod "9b253b8b-4e0a-4890-a43a-c3f414b85c4f" (UID: "9b253b8b-4e0a-4890-a43a-c3f414b85c4f"). InnerVolumeSpecName "kube-api-access-4qrjn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:33.128170 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.128149 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa84bb84-5411-451e-82bb-4de3b078eb23-kube-api-access-bk8mg" (OuterVolumeSpecName: "kube-api-access-bk8mg") pod "fa84bb84-5411-451e-82bb-4de3b078eb23" (UID: "fa84bb84-5411-451e-82bb-4de3b078eb23"). InnerVolumeSpecName "kube-api-access-bk8mg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:00:33.132443 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.132416 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-util" (OuterVolumeSpecName: "util") pod "81e83978-694c-40d6-b6d7-672c2b14b168" (UID: "81e83978-694c-40d6-b6d7-672c2b14b168"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:33.132533 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.132477 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-util" (OuterVolumeSpecName: "util") pod "fa84bb84-5411-451e-82bb-4de3b078eb23" (UID: "fa84bb84-5411-451e-82bb-4de3b078eb23"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:33.132709 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.132688 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-util" (OuterVolumeSpecName: "util") pod "9b253b8b-4e0a-4890-a43a-c3f414b85c4f" (UID: "9b253b8b-4e0a-4890-a43a-c3f414b85c4f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:00:33.225912 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225869 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phqpg\" (UniqueName: \"kubernetes.io/projected/81e83978-694c-40d6-b6d7-672c2b14b168-kube-api-access-phqpg\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.225912 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225913 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-util\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225928 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225941 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-util\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225954 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bk8mg\" (UniqueName: \"kubernetes.io/projected/fa84bb84-5411-451e-82bb-4de3b078eb23-kube-api-access-bk8mg\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225968 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225982 2574 reconciler_common.go:299] "Volume 
detached for volume \"kube-api-access-4qrjn\" (UniqueName: \"kubernetes.io/projected/9b253b8b-4e0a-4890-a43a-c3f414b85c4f-kube-api-access-4qrjn\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.225996 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fa84bb84-5411-451e-82bb-4de3b078eb23-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.226155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.226010 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/81e83978-694c-40d6-b6d7-672c2b14b168-util\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:00:33.927841 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.927801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" event={"ID":"fa84bb84-5411-451e-82bb-4de3b078eb23","Type":"ContainerDied","Data":"635d6cf3f583ecffe31fbb71d7b7bd0a8b02cb4b3b8d9358d4f3efe2339f5be5"} Apr 17 17:00:33.927841 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.927837 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="635d6cf3f583ecffe31fbb71d7b7bd0a8b02cb4b3b8d9358d4f3efe2339f5be5" Apr 17 17:00:33.928280 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.927867 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4" Apr 17 17:00:33.929582 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.929520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" event={"ID":"9b253b8b-4e0a-4890-a43a-c3f414b85c4f","Type":"ContainerDied","Data":"c66af64d9119c91a89ac9aa176776ef505a800ffc2d50b5dbea0d1dbf6d7c5f7"} Apr 17 17:00:33.929582 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.929547 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c66af64d9119c91a89ac9aa176776ef505a800ffc2d50b5dbea0d1dbf6d7c5f7" Apr 17 17:00:33.929582 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.929559 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f" Apr 17 17:00:33.931636 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.931562 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" event={"ID":"81e83978-694c-40d6-b6d7-672c2b14b168","Type":"ContainerDied","Data":"02b5aacfb50f0fc4256cf2723edaae73cbe0401fbbc6346ee2706b107bd41e61"} Apr 17 17:00:33.931636 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.931590 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02b5aacfb50f0fc4256cf2723edaae73cbe0401fbbc6346ee2706b107bd41e61" Apr 17 17:00:33.931843 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:33.931712 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58" Apr 17 17:00:36.503916 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.503880 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv"] Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504204 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerName="pull" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504215 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerName="pull" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504227 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="util" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504232 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="util" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504239 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="extract" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504245 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="extract" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504253 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="extract" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504258 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="extract" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504266 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="pull" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504271 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="pull" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504277 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="pull" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504282 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="pull" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504289 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="util" Apr 17 17:00:36.504293 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504294 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="util" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504302 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504308 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504318 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" 
containerName="util" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504323 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerName="util" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504330 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504335 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504341 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="util" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504346 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="util" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504352 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="pull" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504357 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="pull" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504403 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa84bb84-5411-451e-82bb-4de3b078eb23" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504412 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="81e83978-694c-40d6-b6d7-672c2b14b168" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:00:36.504419 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b253b8b-4e0a-4890-a43a-c3f414b85c4f" containerName="extract" Apr 17 17:00:36.504714 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.504426 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf2b1a44-aabd-490e-8361-7f228a0882ff" containerName="extract" Apr 17 17:00:36.508868 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.508852 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" Apr 17 17:00:36.511535 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.511512 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 17:00:36.511639 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.511595 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-2k88h\"" Apr 17 17:00:36.517957 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.517934 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv"] Apr 17 17:00:36.551902 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.551869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcq7c\" (UniqueName: \"kubernetes.io/projected/028ce1af-38c0-4b4a-b042-373af0683c54-kube-api-access-vcq7c\") pod \"dns-operator-controller-manager-648d5c98bc-9ggsv\" (UID: \"028ce1af-38c0-4b4a-b042-373af0683c54\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" Apr 17 17:00:36.653300 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.653257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcq7c\" (UniqueName: 
\"kubernetes.io/projected/028ce1af-38c0-4b4a-b042-373af0683c54-kube-api-access-vcq7c\") pod \"dns-operator-controller-manager-648d5c98bc-9ggsv\" (UID: \"028ce1af-38c0-4b4a-b042-373af0683c54\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" Apr 17 17:00:36.672850 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.672817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcq7c\" (UniqueName: \"kubernetes.io/projected/028ce1af-38c0-4b4a-b042-373af0683c54-kube-api-access-vcq7c\") pod \"dns-operator-controller-manager-648d5c98bc-9ggsv\" (UID: \"028ce1af-38c0-4b4a-b042-373af0683c54\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" Apr 17 17:00:36.819552 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.819517 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" Apr 17 17:00:36.950176 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:36.950153 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv"] Apr 17 17:00:36.951980 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:36.951955 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod028ce1af_38c0_4b4a_b042_373af0683c54.slice/crio-07b9dda60a9eee222e65a7be04f41fe3c70dae3243c1305e17fbd6962e9dea65 WatchSource:0}: Error finding container 07b9dda60a9eee222e65a7be04f41fe3c70dae3243c1305e17fbd6962e9dea65: Status 404 returned error can't find the container with id 07b9dda60a9eee222e65a7be04f41fe3c70dae3243c1305e17fbd6962e9dea65 Apr 17 17:00:37.949899 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:37.949861 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" 
event={"ID":"028ce1af-38c0-4b4a-b042-373af0683c54","Type":"ContainerStarted","Data":"07b9dda60a9eee222e65a7be04f41fe3c70dae3243c1305e17fbd6962e9dea65"} Apr 17 17:00:39.958492 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:39.958401 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" event={"ID":"028ce1af-38c0-4b4a-b042-373af0683c54","Type":"ContainerStarted","Data":"4b866e3af94b8a3be736c5498c51037c053c06af0ada051d5cdf0127b4a508b2"} Apr 17 17:00:39.958913 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:39.958520 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" Apr 17 17:00:39.976815 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:39.976718 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv" podStartSLOduration=1.382167761 podStartE2EDuration="3.976703187s" podCreationTimestamp="2026-04-17 17:00:36 +0000 UTC" firstStartedPulling="2026-04-17 17:00:36.95389199 +0000 UTC m=+476.353320503" lastFinishedPulling="2026-04-17 17:00:39.548427405 +0000 UTC m=+478.947855929" observedRunningTime="2026-04-17 17:00:39.975597774 +0000 UTC m=+479.375026306" watchObservedRunningTime="2026-04-17 17:00:39.976703187 +0000 UTC m=+479.376131719" Apr 17 17:00:41.853279 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.853242 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"] Apr 17 17:00:41.857828 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.857809 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:41.860465 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.860438 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-p76pq\"" Apr 17 17:00:41.871292 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.871267 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"] Apr 17 17:00:41.896175 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.896144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/11515584-4f92-4ddd-9358-6dda43fb5c30-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:41.896277 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.896214 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl2wm\" (UniqueName: \"kubernetes.io/projected/11515584-4f92-4ddd-9358-6dda43fb5c30-kube-api-access-dl2wm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:41.996923 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.996883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/11515584-4f92-4ddd-9358-6dda43fb5c30-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:41.997090 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.996941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl2wm\" (UniqueName: \"kubernetes.io/projected/11515584-4f92-4ddd-9358-6dda43fb5c30-kube-api-access-dl2wm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:41.997256 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:41.997233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/11515584-4f92-4ddd-9358-6dda43fb5c30-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:42.006549 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.006523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl2wm\" (UniqueName: \"kubernetes.io/projected/11515584-4f92-4ddd-9358-6dda43fb5c30-kube-api-access-dl2wm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:42.167247 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.167165 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" Apr 17 17:00:42.292349 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.292264 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"] Apr 17 17:00:42.295214 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:42.295191 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11515584_4f92_4ddd_9358_6dda43fb5c30.slice/crio-226050fd5848024a7ad1e924fcdf2876f381e04c2891cf96ff162cb783f4583e WatchSource:0}: Error finding container 226050fd5848024a7ad1e924fcdf2876f381e04c2891cf96ff162cb783f4583e: Status 404 returned error can't find the container with id 226050fd5848024a7ad1e924fcdf2876f381e04c2891cf96ff162cb783f4583e Apr 17 17:00:42.678573 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.678532 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f76f76f7f-pn9z2"] Apr 17 17:00:42.687786 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.684980 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.694327 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.693151 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f76f76f7f-pn9z2"]
Apr 17 17:00:42.702460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-oauth-serving-cert\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.702562 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-console-config\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.702562 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-service-ca\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.702562 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8d7c\" (UniqueName: \"kubernetes.io/projected/94dd7853-131d-41d7-bc81-3a34351d085f-kube-api-access-z8d7c\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.702692 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-trusted-ca-bundle\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.702692 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94dd7853-131d-41d7-bc81-3a34351d085f-console-serving-cert\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.702764 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.702699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94dd7853-131d-41d7-bc81-3a34351d085f-console-oauth-config\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803365 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94dd7853-131d-41d7-bc81-3a34351d085f-console-oauth-config\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803527 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-oauth-serving-cert\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803603 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-console-config\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803603 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-service-ca\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803603 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8d7c\" (UniqueName: \"kubernetes.io/projected/94dd7853-131d-41d7-bc81-3a34351d085f-kube-api-access-z8d7c\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803789 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-trusted-ca-bundle\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.803789 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.803712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94dd7853-131d-41d7-bc81-3a34351d085f-console-serving-cert\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.804103 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.804073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-oauth-serving-cert\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.804280 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.804257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-console-config\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.804528 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.804257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-service-ca\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.804528 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.804488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94dd7853-131d-41d7-bc81-3a34351d085f-trusted-ca-bundle\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.806015 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.805995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/94dd7853-131d-41d7-bc81-3a34351d085f-console-oauth-config\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.806175 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.806159 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/94dd7853-131d-41d7-bc81-3a34351d085f-console-serving-cert\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.819967 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.819920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8d7c\" (UniqueName: \"kubernetes.io/projected/94dd7853-131d-41d7-bc81-3a34351d085f-kube-api-access-z8d7c\") pod \"console-5f76f76f7f-pn9z2\" (UID: \"94dd7853-131d-41d7-bc81-3a34351d085f\") " pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:42.972030 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.971948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" event={"ID":"11515584-4f92-4ddd-9358-6dda43fb5c30","Type":"ContainerStarted","Data":"226050fd5848024a7ad1e924fcdf2876f381e04c2891cf96ff162cb783f4583e"}
Apr 17 17:00:42.999250 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:42.999221 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:43.156065 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:43.156041 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f76f76f7f-pn9z2"]
Apr 17 17:00:43.157889 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:43.157851 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94dd7853_131d_41d7_bc81_3a34351d085f.slice/crio-b9943f4572d450041e0ba553edb7592afd378aedb008464e2a0e3a447276f804 WatchSource:0}: Error finding container b9943f4572d450041e0ba553edb7592afd378aedb008464e2a0e3a447276f804: Status 404 returned error can't find the container with id b9943f4572d450041e0ba553edb7592afd378aedb008464e2a0e3a447276f804
Apr 17 17:00:43.977362 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:43.977324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f76f76f7f-pn9z2" event={"ID":"94dd7853-131d-41d7-bc81-3a34351d085f","Type":"ContainerStarted","Data":"df48afb8faa1190cda416686b539b78950a9935d9e6fad8bd631d21ccbc1297c"}
Apr 17 17:00:43.977362 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:43.977362 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f76f76f7f-pn9z2" event={"ID":"94dd7853-131d-41d7-bc81-3a34351d085f","Type":"ContainerStarted","Data":"b9943f4572d450041e0ba553edb7592afd378aedb008464e2a0e3a447276f804"}
Apr 17 17:00:44.004067 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:44.004013 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f76f76f7f-pn9z2" podStartSLOduration=2.003994112 podStartE2EDuration="2.003994112s" podCreationTimestamp="2026-04-17 17:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:00:44.002385694 +0000 UTC m=+483.401814227" watchObservedRunningTime="2026-04-17 17:00:44.003994112 +0000 UTC m=+483.403422645"
Apr 17 17:00:47.993306 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:47.993272 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" event={"ID":"11515584-4f92-4ddd-9358-6dda43fb5c30","Type":"ContainerStarted","Data":"c9741b15b90fb1c2652f6d78f09bfcb0c9daa374a4c44a55417fd8cbe502e7ee"}
Apr 17 17:00:47.993695 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:47.993420 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"
Apr 17 17:00:48.013360 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.013308 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" podStartSLOduration=1.889925661 podStartE2EDuration="7.013291982s" podCreationTimestamp="2026-04-17 17:00:41 +0000 UTC" firstStartedPulling="2026-04-17 17:00:42.297713599 +0000 UTC m=+481.697142112" lastFinishedPulling="2026-04-17 17:00:47.421079923 +0000 UTC m=+486.820508433" observedRunningTime="2026-04-17 17:00:48.011731454 +0000 UTC m=+487.411159990" watchObservedRunningTime="2026-04-17 17:00:48.013291982 +0000 UTC m=+487.412720513"
Apr 17 17:00:48.671846 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.671811 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hbw8b"]
Apr 17 17:00:48.675409 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.675394 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:00:48.677990 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.677964 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-mgxq7\""
Apr 17 17:00:48.684698 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.684646 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hbw8b"]
Apr 17 17:00:48.757012 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.756984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rfjf\" (UniqueName: \"kubernetes.io/projected/af451ffc-f75b-48b4-abbf-9bd33991bcdc-kube-api-access-6rfjf\") pod \"authorino-operator-657f44b778-hbw8b\" (UID: \"af451ffc-f75b-48b4-abbf-9bd33991bcdc\") " pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:00:48.857945 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.857898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfjf\" (UniqueName: \"kubernetes.io/projected/af451ffc-f75b-48b4-abbf-9bd33991bcdc-kube-api-access-6rfjf\") pod \"authorino-operator-657f44b778-hbw8b\" (UID: \"af451ffc-f75b-48b4-abbf-9bd33991bcdc\") " pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:00:48.866563 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.866542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfjf\" (UniqueName: \"kubernetes.io/projected/af451ffc-f75b-48b4-abbf-9bd33991bcdc-kube-api-access-6rfjf\") pod \"authorino-operator-657f44b778-hbw8b\" (UID: \"af451ffc-f75b-48b4-abbf-9bd33991bcdc\") " pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:00:48.986933 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:48.986860 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:00:49.319348 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:49.319325 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-hbw8b"]
Apr 17 17:00:49.321232 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:00:49.321202 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf451ffc_f75b_48b4_abbf_9bd33991bcdc.slice/crio-e8208122717f76b1e3fb6f54844e428aff8c9c3aa031cac70d9d06a10da53c32 WatchSource:0}: Error finding container e8208122717f76b1e3fb6f54844e428aff8c9c3aa031cac70d9d06a10da53c32: Status 404 returned error can't find the container with id e8208122717f76b1e3fb6f54844e428aff8c9c3aa031cac70d9d06a10da53c32
Apr 17 17:00:50.002223 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:50.002188 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b" event={"ID":"af451ffc-f75b-48b4-abbf-9bd33991bcdc","Type":"ContainerStarted","Data":"e8208122717f76b1e3fb6f54844e428aff8c9c3aa031cac70d9d06a10da53c32"}
Apr 17 17:00:50.965005 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:50.964974 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9ggsv"
Apr 17 17:00:52.011623 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:52.011585 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b" event={"ID":"af451ffc-f75b-48b4-abbf-9bd33991bcdc","Type":"ContainerStarted","Data":"c6f70ea2233cbc0dbc5cb59fe3f4f87b923b83dbedc922fc5f9008f6748d410e"}
Apr 17 17:00:52.011980 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:52.011638 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:00:52.030389 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:52.030340 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b" podStartSLOduration=1.8804838419999999 podStartE2EDuration="4.030327885s" podCreationTimestamp="2026-04-17 17:00:48 +0000 UTC" firstStartedPulling="2026-04-17 17:00:49.323185258 +0000 UTC m=+488.722613767" lastFinishedPulling="2026-04-17 17:00:51.473029297 +0000 UTC m=+490.872457810" observedRunningTime="2026-04-17 17:00:52.029573075 +0000 UTC m=+491.429001611" watchObservedRunningTime="2026-04-17 17:00:52.030327885 +0000 UTC m=+491.429756418"
Apr 17 17:00:52.999616 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:52.999578 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:52.999807 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:52.999632 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:53.004117 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:53.004095 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:53.018892 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:53.018869 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f76f76f7f-pn9z2"
Apr 17 17:00:53.066425 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:53.066029 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c5f89984-bqc4r"]
Apr 17 17:00:58.999471 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:00:58.999442 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"
Apr 17 17:01:00.844907 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.844872 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"]
Apr 17 17:01:00.845307 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.845083 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" containerName="manager" containerID="cri-o://c9741b15b90fb1c2652f6d78f09bfcb0c9daa374a4c44a55417fd8cbe502e7ee" gracePeriod=2
Apr 17 17:01:00.852893 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.852865 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"]
Apr 17 17:01:00.869805 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.869777 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"]
Apr 17 17:01:00.870155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.870142 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" containerName="manager"
Apr 17 17:01:00.870198 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.870158 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" containerName="manager"
Apr 17 17:01:00.870233 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.870221 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" containerName="manager"
Apr 17 17:01:00.873417 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.873395 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:00.876954 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.876927 2574 status_manager.go:895] "Failed to get status for pod" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" is forbidden: User \"system:node:ip-10-0-138-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-47.ec2.internal' and this object"
Apr 17 17:01:00.899187 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.899126 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"]
Apr 17 17:01:00.901797 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.901771 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"]
Apr 17 17:01:00.905290 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.905273 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:00.934390 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.934364 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"]
Apr 17 17:01:00.943090 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.943062 2574 status_manager.go:895] "Failed to get status for pod" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" is forbidden: User \"system:node:ip-10-0-138-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-47.ec2.internal' and this object"
Apr 17 17:01:00.965107 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.965079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6jzr\" (UniqueName: \"kubernetes.io/projected/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-kube-api-access-b6jzr\") pod \"kuadrant-operator-controller-manager-84b657d985-dqhz8\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:00.965201 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.965161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-dqhz8\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:00.965253 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.965227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/47364fdc-71ae-400e-aaa5-6c79044da5db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dpmm2\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:00.965292 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:00.965281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj2lv\" (UniqueName: \"kubernetes.io/projected/47364fdc-71ae-400e-aaa5-6c79044da5db-kube-api-access-jj2lv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dpmm2\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:01.044087 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.044057 2574 generic.go:358] "Generic (PLEG): container finished" podID="11515584-4f92-4ddd-9358-6dda43fb5c30" containerID="c9741b15b90fb1c2652f6d78f09bfcb0c9daa374a4c44a55417fd8cbe502e7ee" exitCode=0
Apr 17 17:01:01.066485 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.066457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6jzr\" (UniqueName: \"kubernetes.io/projected/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-kube-api-access-b6jzr\") pod \"kuadrant-operator-controller-manager-84b657d985-dqhz8\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:01.066581 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.066532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-dqhz8\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:01.066673 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.066595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/47364fdc-71ae-400e-aaa5-6c79044da5db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dpmm2\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:01.066736 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.066646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jj2lv\" (UniqueName: \"kubernetes.io/projected/47364fdc-71ae-400e-aaa5-6c79044da5db-kube-api-access-jj2lv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dpmm2\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:01.066984 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.066960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-dqhz8\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:01.067077 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.067059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/47364fdc-71ae-400e-aaa5-6c79044da5db-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dpmm2\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:01.078058 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.078039 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"
Apr 17 17:01:01.080715 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.080691 2574 status_manager.go:895] "Failed to get status for pod" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" is forbidden: User \"system:node:ip-10-0-138-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-47.ec2.internal' and this object"
Apr 17 17:01:01.084011 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.083987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6jzr\" (UniqueName: \"kubernetes.io/projected/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-kube-api-access-b6jzr\") pod \"kuadrant-operator-controller-manager-84b657d985-dqhz8\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:01.084095 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.084076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj2lv\" (UniqueName: \"kubernetes.io/projected/47364fdc-71ae-400e-aaa5-6c79044da5db-kube-api-access-jj2lv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dpmm2\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:01.167049 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.166960 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/11515584-4f92-4ddd-9358-6dda43fb5c30-extensions-socket-volume\") pod \"11515584-4f92-4ddd-9358-6dda43fb5c30\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") "
Apr 17 17:01:01.167049 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.167021 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl2wm\" (UniqueName: \"kubernetes.io/projected/11515584-4f92-4ddd-9358-6dda43fb5c30-kube-api-access-dl2wm\") pod \"11515584-4f92-4ddd-9358-6dda43fb5c30\" (UID: \"11515584-4f92-4ddd-9358-6dda43fb5c30\") "
Apr 17 17:01:01.167573 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.167538 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11515584-4f92-4ddd-9358-6dda43fb5c30-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "11515584-4f92-4ddd-9358-6dda43fb5c30" (UID: "11515584-4f92-4ddd-9358-6dda43fb5c30"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:01:01.168292 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.168267 2574 status_manager.go:895] "Failed to get status for pod" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-598sw\" is forbidden: User \"system:node:ip-10-0-138-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-138-47.ec2.internal' and this object"
Apr 17 17:01:01.169255 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.169234 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11515584-4f92-4ddd-9358-6dda43fb5c30-kube-api-access-dl2wm" (OuterVolumeSpecName: "kube-api-access-dl2wm") pod "11515584-4f92-4ddd-9358-6dda43fb5c30" (UID: "11515584-4f92-4ddd-9358-6dda43fb5c30"). InnerVolumeSpecName "kube-api-access-dl2wm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:01:01.229345 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.229311 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:01.238094 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.236315 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:01.267742 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.267687 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/11515584-4f92-4ddd-9358-6dda43fb5c30-extensions-socket-volume\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\""
Apr 17 17:01:01.267742 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.267716 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dl2wm\" (UniqueName: \"kubernetes.io/projected/11515584-4f92-4ddd-9358-6dda43fb5c30-kube-api-access-dl2wm\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\""
Apr 17 17:01:01.371981 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.371955 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"]
Apr 17 17:01:01.374286 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:01:01.374254 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47364fdc_71ae_400e_aaa5_6c79044da5db.slice/crio-446869fee5817d9a35196c7b068e1a99efd211037752845b29d29163ca7e5b3e WatchSource:0}: Error finding container 446869fee5817d9a35196c7b068e1a99efd211037752845b29d29163ca7e5b3e: Status 404 returned error can't find the container with id 446869fee5817d9a35196c7b068e1a99efd211037752845b29d29163ca7e5b3e
Apr 17 17:01:01.407266 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:01.407245 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"]
Apr 17 17:01:01.409739 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:01:01.409712 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec03e77c_2ab2_4a0d_9042_bde76e7425f4.slice/crio-a10a1759ba6abb251b3629850e66f609e1c9141660cfdff8872faabe9cda908d WatchSource:0}: Error finding container a10a1759ba6abb251b3629850e66f609e1c9141660cfdff8872faabe9cda908d: Status 404 returned error can't find the container with id a10a1759ba6abb251b3629850e66f609e1c9141660cfdff8872faabe9cda908d
Apr 17 17:01:02.049333 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.049282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" event={"ID":"47364fdc-71ae-400e-aaa5-6c79044da5db","Type":"ContainerStarted","Data":"38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea"}
Apr 17 17:01:02.049333 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.049333 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:02.049850 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.049348 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" event={"ID":"47364fdc-71ae-400e-aaa5-6c79044da5db","Type":"ContainerStarted","Data":"446869fee5817d9a35196c7b068e1a99efd211037752845b29d29163ca7e5b3e"}
Apr 17 17:01:02.050632 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.050611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" event={"ID":"ec03e77c-2ab2-4a0d-9042-bde76e7425f4","Type":"ContainerStarted","Data":"caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8"}
Apr 17 17:01:02.050761 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.050637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" event={"ID":"ec03e77c-2ab2-4a0d-9042-bde76e7425f4","Type":"ContainerStarted","Data":"a10a1759ba6abb251b3629850e66f609e1c9141660cfdff8872faabe9cda908d"}
Apr 17 17:01:02.050761 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.050713 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:02.051736 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.051720 2574 scope.go:117] "RemoveContainer" containerID="c9741b15b90fb1c2652f6d78f09bfcb0c9daa374a4c44a55417fd8cbe502e7ee"
Apr 17 17:01:02.051736 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.051729 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-598sw"
Apr 17 17:01:02.076013 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.071971 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" podStartSLOduration=2.071953493 podStartE2EDuration="2.071953493s" podCreationTimestamp="2026-04-17 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:01:02.068233418 +0000 UTC m=+501.467661951" watchObservedRunningTime="2026-04-17 17:01:02.071953493 +0000 UTC m=+501.471382028"
Apr 17 17:01:02.086606 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:02.086552 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" podStartSLOduration=2.086540142 podStartE2EDuration="2.086540142s" podCreationTimestamp="2026-04-17 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:01:02.086215648 +0000 UTC m=+501.485644180" watchObservedRunningTime="2026-04-17 17:01:02.086540142 +0000 UTC m=+501.485968673"
Apr 17 17:01:03.017086 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:03.017051 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-hbw8b"
Apr 17 17:01:03.166837 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:03.166799 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11515584-4f92-4ddd-9358-6dda43fb5c30" path="/var/lib/kubelet/pods/11515584-4f92-4ddd-9358-6dda43fb5c30/volumes"
Apr 17 17:01:13.058904 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.058875 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"
Apr 17 17:01:13.059316 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.059159 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"
Apr 17 17:01:13.139443 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.139411 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"]
Apr 17 17:01:13.139716 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.139668 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" podUID="47364fdc-71ae-400e-aaa5-6c79044da5db" containerName="manager" containerID="cri-o://38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea" gracePeriod=10
Apr 17 17:01:13.390095 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.390066 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" Apr 17 17:01:13.471209 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.471175 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6"] Apr 17 17:01:13.471561 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.471547 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47364fdc-71ae-400e-aaa5-6c79044da5db" containerName="manager" Apr 17 17:01:13.471561 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.471563 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="47364fdc-71ae-400e-aaa5-6c79044da5db" containerName="manager" Apr 17 17:01:13.471683 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.471611 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="47364fdc-71ae-400e-aaa5-6c79044da5db" containerName="manager" Apr 17 17:01:13.475259 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.475242 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.475679 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.475621 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/47364fdc-71ae-400e-aaa5-6c79044da5db-extensions-socket-volume\") pod \"47364fdc-71ae-400e-aaa5-6c79044da5db\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " Apr 17 17:01:13.475805 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.475687 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj2lv\" (UniqueName: \"kubernetes.io/projected/47364fdc-71ae-400e-aaa5-6c79044da5db-kube-api-access-jj2lv\") pod \"47364fdc-71ae-400e-aaa5-6c79044da5db\" (UID: \"47364fdc-71ae-400e-aaa5-6c79044da5db\") " Apr 17 17:01:13.475981 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.475961 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47364fdc-71ae-400e-aaa5-6c79044da5db-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "47364fdc-71ae-400e-aaa5-6c79044da5db" (UID: "47364fdc-71ae-400e-aaa5-6c79044da5db"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:01:13.477791 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.477773 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47364fdc-71ae-400e-aaa5-6c79044da5db-kube-api-access-jj2lv" (OuterVolumeSpecName: "kube-api-access-jj2lv") pod "47364fdc-71ae-400e-aaa5-6c79044da5db" (UID: "47364fdc-71ae-400e-aaa5-6c79044da5db"). InnerVolumeSpecName "kube-api-access-jj2lv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:13.487565 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.487531 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6"] Apr 17 17:01:13.577008 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.576981 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e72a6c48-194c-468e-8cb9-befda0e033d7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-ftzl6\" (UID: \"e72a6c48-194c-468e-8cb9-befda0e033d7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.577180 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.577016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgpzf\" (UniqueName: \"kubernetes.io/projected/e72a6c48-194c-468e-8cb9-befda0e033d7-kube-api-access-bgpzf\") pod \"kuadrant-operator-controller-manager-55c7f4c975-ftzl6\" (UID: \"e72a6c48-194c-468e-8cb9-befda0e033d7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.577180 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.577114 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/47364fdc-71ae-400e-aaa5-6c79044da5db-extensions-socket-volume\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:13.577180 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.577124 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jj2lv\" (UniqueName: \"kubernetes.io/projected/47364fdc-71ae-400e-aaa5-6c79044da5db-kube-api-access-jj2lv\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:13.679216 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:01:13.679115 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e72a6c48-194c-468e-8cb9-befda0e033d7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-ftzl6\" (UID: \"e72a6c48-194c-468e-8cb9-befda0e033d7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.679216 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.679188 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgpzf\" (UniqueName: \"kubernetes.io/projected/e72a6c48-194c-468e-8cb9-befda0e033d7-kube-api-access-bgpzf\") pod \"kuadrant-operator-controller-manager-55c7f4c975-ftzl6\" (UID: \"e72a6c48-194c-468e-8cb9-befda0e033d7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.679987 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.679955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e72a6c48-194c-468e-8cb9-befda0e033d7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-ftzl6\" (UID: \"e72a6c48-194c-468e-8cb9-befda0e033d7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.690726 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.690698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgpzf\" (UniqueName: \"kubernetes.io/projected/e72a6c48-194c-468e-8cb9-befda0e033d7-kube-api-access-bgpzf\") pod \"kuadrant-operator-controller-manager-55c7f4c975-ftzl6\" (UID: \"e72a6c48-194c-468e-8cb9-befda0e033d7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.792500 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.792462 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:13.939427 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:13.939399 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6"] Apr 17 17:01:13.940887 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:01:13.940861 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode72a6c48_194c_468e_8cb9_befda0e033d7.slice/crio-f1512ba10cda4b757b370f206aeb624681e1af458a396f832f8c626de19785a6 WatchSource:0}: Error finding container f1512ba10cda4b757b370f206aeb624681e1af458a396f832f8c626de19785a6: Status 404 returned error can't find the container with id f1512ba10cda4b757b370f206aeb624681e1af458a396f832f8c626de19785a6 Apr 17 17:01:14.100701 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.100649 2574 generic.go:358] "Generic (PLEG): container finished" podID="47364fdc-71ae-400e-aaa5-6c79044da5db" containerID="38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea" exitCode=0 Apr 17 17:01:14.101133 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.100741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" event={"ID":"47364fdc-71ae-400e-aaa5-6c79044da5db","Type":"ContainerDied","Data":"38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea"} Apr 17 17:01:14.101133 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.100758 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" Apr 17 17:01:14.101133 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.100772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2" event={"ID":"47364fdc-71ae-400e-aaa5-6c79044da5db","Type":"ContainerDied","Data":"446869fee5817d9a35196c7b068e1a99efd211037752845b29d29163ca7e5b3e"} Apr 17 17:01:14.101133 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.100795 2574 scope.go:117] "RemoveContainer" containerID="38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea" Apr 17 17:01:14.102476 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.102446 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" event={"ID":"e72a6c48-194c-468e-8cb9-befda0e033d7","Type":"ContainerStarted","Data":"19af22bc0b63feeb0e0c29dbfc9175dd26b89a54afe5f2f4a2887692db091be0"} Apr 17 17:01:14.102564 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.102478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" event={"ID":"e72a6c48-194c-468e-8cb9-befda0e033d7","Type":"ContainerStarted","Data":"f1512ba10cda4b757b370f206aeb624681e1af458a396f832f8c626de19785a6"} Apr 17 17:01:14.102564 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.102536 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:14.109608 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.109590 2574 scope.go:117] "RemoveContainer" containerID="38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea" Apr 17 17:01:14.109907 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:01:14.109889 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea\": container with ID starting with 38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea not found: ID does not exist" containerID="38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea" Apr 17 17:01:14.109956 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.109915 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea"} err="failed to get container status \"38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea\": rpc error: code = NotFound desc = could not find container \"38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea\": container with ID starting with 38e43b04296d1c57afb526741514198de4a519e0dc1663a1393787c9b53a56ea not found: ID does not exist" Apr 17 17:01:14.126824 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.126784 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" podStartSLOduration=1.126769973 podStartE2EDuration="1.126769973s" podCreationTimestamp="2026-04-17 17:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:01:14.125846401 +0000 UTC m=+513.525274928" watchObservedRunningTime="2026-04-17 17:01:14.126769973 +0000 UTC m=+513.526198506" Apr 17 17:01:14.142275 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.142249 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"] Apr 17 17:01:14.147752 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:14.147735 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dpmm2"] Apr 17 17:01:15.167326 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:01:15.167298 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47364fdc-71ae-400e-aaa5-6c79044da5db" path="/var/lib/kubelet/pods/47364fdc-71ae-400e-aaa5-6c79044da5db/volumes" Apr 17 17:01:18.091717 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.091638 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c5f89984-bqc4r" podUID="a0e367d4-6a64-4dbd-aa5d-511e12497f79" containerName="console" containerID="cri-o://a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13" gracePeriod=15 Apr 17 17:01:18.346053 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.346002 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c5f89984-bqc4r_a0e367d4-6a64-4dbd-aa5d-511e12497f79/console/0.log" Apr 17 17:01:18.346156 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.346060 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 17:01:18.519195 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519163 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-serving-cert\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519370 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519226 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-oauth-serving-cert\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519370 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519250 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-config\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519370 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519268 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769qt\" (UniqueName: \"kubernetes.io/projected/a0e367d4-6a64-4dbd-aa5d-511e12497f79-kube-api-access-769qt\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519370 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519292 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-service-ca\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519370 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519314 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-oauth-config\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519370 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519341 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-trusted-ca-bundle\") pod \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\" (UID: \"a0e367d4-6a64-4dbd-aa5d-511e12497f79\") " Apr 17 17:01:18.519889 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519777 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:01:18.520006 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519788 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-service-ca" (OuterVolumeSpecName: "service-ca") pod "a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:01:18.520006 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519775 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-config" (OuterVolumeSpecName: "console-config") pod "a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:01:18.520006 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.519962 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:01:18.521716 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.521691 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e367d4-6a64-4dbd-aa5d-511e12497f79-kube-api-access-769qt" (OuterVolumeSpecName: "kube-api-access-769qt") pod "a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "kube-api-access-769qt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:18.522164 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.522141 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:01:18.522270 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.522214 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a0e367d4-6a64-4dbd-aa5d-511e12497f79" (UID: "a0e367d4-6a64-4dbd-aa5d-511e12497f79"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:01:18.620304 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.620218 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-serving-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:18.620304 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.620244 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-oauth-serving-cert\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:18.620304 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.620254 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:18.620304 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:01:18.620267 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-769qt\" (UniqueName: \"kubernetes.io/projected/a0e367d4-6a64-4dbd-aa5d-511e12497f79-kube-api-access-769qt\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:18.620304 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.620276 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-service-ca\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:18.620304 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.620286 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0e367d4-6a64-4dbd-aa5d-511e12497f79-console-oauth-config\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:18.620304 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:18.620296 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e367d4-6a64-4dbd-aa5d-511e12497f79-trusted-ca-bundle\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:19.122911 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.122888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c5f89984-bqc4r_a0e367d4-6a64-4dbd-aa5d-511e12497f79/console/0.log" Apr 17 17:01:19.123330 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.122926 2574 generic.go:358] "Generic (PLEG): container finished" podID="a0e367d4-6a64-4dbd-aa5d-511e12497f79" containerID="a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13" exitCode=2 Apr 17 17:01:19.123330 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.122965 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5f89984-bqc4r" 
event={"ID":"a0e367d4-6a64-4dbd-aa5d-511e12497f79","Type":"ContainerDied","Data":"a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13"} Apr 17 17:01:19.123330 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.122986 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c5f89984-bqc4r" event={"ID":"a0e367d4-6a64-4dbd-aa5d-511e12497f79","Type":"ContainerDied","Data":"a23963ea41c7db66c9da0fbcde418425d7a9b67f69232de52e4ee429362eb71b"} Apr 17 17:01:19.123330 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.122991 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c5f89984-bqc4r" Apr 17 17:01:19.123330 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.122999 2574 scope.go:117] "RemoveContainer" containerID="a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13" Apr 17 17:01:19.131830 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.131814 2574 scope.go:117] "RemoveContainer" containerID="a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13" Apr 17 17:01:19.132051 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:01:19.132033 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13\": container with ID starting with a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13 not found: ID does not exist" containerID="a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13" Apr 17 17:01:19.132104 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.132069 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13"} err="failed to get container status \"a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13\": rpc error: code = NotFound desc = could not find container 
\"a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13\": container with ID starting with a1427f7f68f56967700106a800f336096fc1a2297b1566790827f5c601a24d13 not found: ID does not exist" Apr 17 17:01:19.147835 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.147810 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c5f89984-bqc4r"] Apr 17 17:01:19.151784 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.151763 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c5f89984-bqc4r"] Apr 17 17:01:19.167504 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:19.167475 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e367d4-6a64-4dbd-aa5d-511e12497f79" path="/var/lib/kubelet/pods/a0e367d4-6a64-4dbd-aa5d-511e12497f79/volumes" Apr 17 17:01:25.109226 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.109191 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ftzl6" Apr 17 17:01:25.167671 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.167622 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"] Apr 17 17:01:25.167923 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.167898 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" podUID="ec03e77c-2ab2-4a0d-9042-bde76e7425f4" containerName="manager" containerID="cri-o://caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8" gracePeriod=10 Apr 17 17:01:25.410960 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.410939 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" Apr 17 17:01:25.580481 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.580427 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6jzr\" (UniqueName: \"kubernetes.io/projected/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-kube-api-access-b6jzr\") pod \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " Apr 17 17:01:25.580481 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.580495 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-extensions-socket-volume\") pod \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\" (UID: \"ec03e77c-2ab2-4a0d-9042-bde76e7425f4\") " Apr 17 17:01:25.580924 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.580888 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ec03e77c-2ab2-4a0d-9042-bde76e7425f4" (UID: "ec03e77c-2ab2-4a0d-9042-bde76e7425f4"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:01:25.582804 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.582780 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-kube-api-access-b6jzr" (OuterVolumeSpecName: "kube-api-access-b6jzr") pod "ec03e77c-2ab2-4a0d-9042-bde76e7425f4" (UID: "ec03e77c-2ab2-4a0d-9042-bde76e7425f4"). InnerVolumeSpecName "kube-api-access-b6jzr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:25.681931 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.681857 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6jzr\" (UniqueName: \"kubernetes.io/projected/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-kube-api-access-b6jzr\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:25.681931 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:25.681886 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ec03e77c-2ab2-4a0d-9042-bde76e7425f4-extensions-socket-volume\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:26.150884 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.150854 2574 generic.go:358] "Generic (PLEG): container finished" podID="ec03e77c-2ab2-4a0d-9042-bde76e7425f4" containerID="caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8" exitCode=0 Apr 17 17:01:26.151286 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.150914 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" Apr 17 17:01:26.151286 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.150945 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" event={"ID":"ec03e77c-2ab2-4a0d-9042-bde76e7425f4","Type":"ContainerDied","Data":"caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8"} Apr 17 17:01:26.151286 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.150984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8" event={"ID":"ec03e77c-2ab2-4a0d-9042-bde76e7425f4","Type":"ContainerDied","Data":"a10a1759ba6abb251b3629850e66f609e1c9141660cfdff8872faabe9cda908d"} Apr 17 17:01:26.151286 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.150999 2574 scope.go:117] "RemoveContainer" containerID="caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8" Apr 17 17:01:26.162884 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.162827 2574 scope.go:117] "RemoveContainer" containerID="caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8" Apr 17 17:01:26.163261 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:01:26.163192 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8\": container with ID starting with caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8 not found: ID does not exist" containerID="caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8" Apr 17 17:01:26.163352 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.163271 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8"} err="failed to get container status 
\"caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8\": rpc error: code = NotFound desc = could not find container \"caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8\": container with ID starting with caaad6d57472b5a38da040b38b0f3f87ddeed03ee02df859a3619d102646a4a8 not found: ID does not exist" Apr 17 17:01:26.186019 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.185992 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"] Apr 17 17:01:26.192669 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:26.192631 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-dqhz8"] Apr 17 17:01:27.166427 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:27.166391 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec03e77c-2ab2-4a0d-9042-bde76e7425f4" path="/var/lib/kubelet/pods/ec03e77c-2ab2-4a0d-9042-bde76e7425f4/volumes" Apr 17 17:01:45.091529 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.091491 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:01:45.091941 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.091878 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0e367d4-6a64-4dbd-aa5d-511e12497f79" containerName="console" Apr 17 17:01:45.091941 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.091892 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e367d4-6a64-4dbd-aa5d-511e12497f79" containerName="console" Apr 17 17:01:45.091941 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.091915 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec03e77c-2ab2-4a0d-9042-bde76e7425f4" containerName="manager" Apr 17 17:01:45.091941 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.091923 2574 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ec03e77c-2ab2-4a0d-9042-bde76e7425f4" containerName="manager" Apr 17 17:01:45.092106 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.092024 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0e367d4-6a64-4dbd-aa5d-511e12497f79" containerName="console" Apr 17 17:01:45.092106 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.092044 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec03e77c-2ab2-4a0d-9042-bde76e7425f4" containerName="manager" Apr 17 17:01:45.096587 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.096569 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.099423 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.099404 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-rt5sd\"" Apr 17 17:01:45.099566 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.099446 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 17:01:45.103960 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.103736 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:01:45.146946 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.146903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-config-file\") pod \"limitador-limitador-7d549b5b-z949k\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.147113 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.146957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj54s\" 
(UniqueName: \"kubernetes.io/projected/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-kube-api-access-zj54s\") pod \"limitador-limitador-7d549b5b-z949k\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.194549 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.194508 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:01:45.248402 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.248374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-config-file\") pod \"limitador-limitador-7d549b5b-z949k\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.248586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.248410 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj54s\" (UniqueName: \"kubernetes.io/projected/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-kube-api-access-zj54s\") pod \"limitador-limitador-7d549b5b-z949k\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.249011 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.248992 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-config-file\") pod \"limitador-limitador-7d549b5b-z949k\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.260357 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.260335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj54s\" (UniqueName: \"kubernetes.io/projected/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-kube-api-access-zj54s\") pod 
\"limitador-limitador-7d549b5b-z949k\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.409055 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.408966 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:45.559816 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:45.559789 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:01:45.561688 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:01:45.561636 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff5dec6e_8ac9_4494_b3ec_44716ca1bf5a.slice/crio-0edd2960b831fc6714ac7fd3ebc93b60d38d70bfbf146c8bb76585660471ded5 WatchSource:0}: Error finding container 0edd2960b831fc6714ac7fd3ebc93b60d38d70bfbf146c8bb76585660471ded5: Status 404 returned error can't find the container with id 0edd2960b831fc6714ac7fd3ebc93b60d38d70bfbf146c8bb76585660471ded5 Apr 17 17:01:46.116176 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.116139 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-25jhs"] Apr 17 17:01:46.119636 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.119615 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:46.122479 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.122459 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wrrmf\"" Apr 17 17:01:46.124323 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.124300 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-25jhs"] Apr 17 17:01:46.155547 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.155519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnn2l\" (UniqueName: \"kubernetes.io/projected/d070a7de-82e2-4294-9f98-cc2f79b384a2-kube-api-access-nnn2l\") pod \"authorino-f99f4b5cd-25jhs\" (UID: \"d070a7de-82e2-4294-9f98-cc2f79b384a2\") " pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:46.222399 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.222362 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" event={"ID":"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a","Type":"ContainerStarted","Data":"0edd2960b831fc6714ac7fd3ebc93b60d38d70bfbf146c8bb76585660471ded5"} Apr 17 17:01:46.257074 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.257037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnn2l\" (UniqueName: \"kubernetes.io/projected/d070a7de-82e2-4294-9f98-cc2f79b384a2-kube-api-access-nnn2l\") pod \"authorino-f99f4b5cd-25jhs\" (UID: \"d070a7de-82e2-4294-9f98-cc2f79b384a2\") " pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:46.267970 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.267944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnn2l\" (UniqueName: \"kubernetes.io/projected/d070a7de-82e2-4294-9f98-cc2f79b384a2-kube-api-access-nnn2l\") pod \"authorino-f99f4b5cd-25jhs\" 
(UID: \"d070a7de-82e2-4294-9f98-cc2f79b384a2\") " pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:46.431024 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.430933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:46.575176 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:46.575143 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-25jhs"] Apr 17 17:01:46.578581 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:01:46.578557 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd070a7de_82e2_4294_9f98_cc2f79b384a2.slice/crio-3a824a5556cf9590e81a63a01ba7aaafaf68918b09b61ff817513361098ce720 WatchSource:0}: Error finding container 3a824a5556cf9590e81a63a01ba7aaafaf68918b09b61ff817513361098ce720: Status 404 returned error can't find the container with id 3a824a5556cf9590e81a63a01ba7aaafaf68918b09b61ff817513361098ce720 Apr 17 17:01:47.227850 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:47.227817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" event={"ID":"d070a7de-82e2-4294-9f98-cc2f79b384a2","Type":"ContainerStarted","Data":"3a824a5556cf9590e81a63a01ba7aaafaf68918b09b61ff817513361098ce720"} Apr 17 17:01:50.241416 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:50.241318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" event={"ID":"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a","Type":"ContainerStarted","Data":"a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35"} Apr 17 17:01:50.241877 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:50.241452 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:01:50.242631 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:01:50.242611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" event={"ID":"d070a7de-82e2-4294-9f98-cc2f79b384a2","Type":"ContainerStarted","Data":"4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5"} Apr 17 17:01:50.258771 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:50.258728 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" podStartSLOduration=1.001053319 podStartE2EDuration="5.258716936s" podCreationTimestamp="2026-04-17 17:01:45 +0000 UTC" firstStartedPulling="2026-04-17 17:01:45.564041238 +0000 UTC m=+544.963469752" lastFinishedPulling="2026-04-17 17:01:49.821704854 +0000 UTC m=+549.221133369" observedRunningTime="2026-04-17 17:01:50.257648813 +0000 UTC m=+549.657077344" watchObservedRunningTime="2026-04-17 17:01:50.258716936 +0000 UTC m=+549.658145468" Apr 17 17:01:50.270889 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:50.270850 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" podStartSLOduration=1.084003301 podStartE2EDuration="4.270838963s" podCreationTimestamp="2026-04-17 17:01:46 +0000 UTC" firstStartedPulling="2026-04-17 17:01:46.580527231 +0000 UTC m=+545.979955757" lastFinishedPulling="2026-04-17 17:01:49.767362892 +0000 UTC m=+549.166791419" observedRunningTime="2026-04-17 17:01:50.270571954 +0000 UTC m=+549.670000487" watchObservedRunningTime="2026-04-17 17:01:50.270838963 +0000 UTC m=+549.670267497" Apr 17 17:01:50.603533 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:50.603491 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-25jhs"] Apr 17 17:01:52.249252 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:52.249213 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" 
podUID="d070a7de-82e2-4294-9f98-cc2f79b384a2" containerName="authorino" containerID="cri-o://4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5" gracePeriod=30 Apr 17 17:01:52.494205 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:52.494181 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:52.617650 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:52.617608 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnn2l\" (UniqueName: \"kubernetes.io/projected/d070a7de-82e2-4294-9f98-cc2f79b384a2-kube-api-access-nnn2l\") pod \"d070a7de-82e2-4294-9f98-cc2f79b384a2\" (UID: \"d070a7de-82e2-4294-9f98-cc2f79b384a2\") " Apr 17 17:01:52.619707 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:52.619687 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d070a7de-82e2-4294-9f98-cc2f79b384a2-kube-api-access-nnn2l" (OuterVolumeSpecName: "kube-api-access-nnn2l") pod "d070a7de-82e2-4294-9f98-cc2f79b384a2" (UID: "d070a7de-82e2-4294-9f98-cc2f79b384a2"). InnerVolumeSpecName "kube-api-access-nnn2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:01:52.718450 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:52.718419 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnn2l\" (UniqueName: \"kubernetes.io/projected/d070a7de-82e2-4294-9f98-cc2f79b384a2-kube-api-access-nnn2l\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:01:53.254038 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.254005 2574 generic.go:358] "Generic (PLEG): container finished" podID="d070a7de-82e2-4294-9f98-cc2f79b384a2" containerID="4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5" exitCode=0 Apr 17 17:01:53.254431 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.254063 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" Apr 17 17:01:53.254431 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.254094 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" event={"ID":"d070a7de-82e2-4294-9f98-cc2f79b384a2","Type":"ContainerDied","Data":"4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5"} Apr 17 17:01:53.254431 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.254135 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-25jhs" event={"ID":"d070a7de-82e2-4294-9f98-cc2f79b384a2","Type":"ContainerDied","Data":"3a824a5556cf9590e81a63a01ba7aaafaf68918b09b61ff817513361098ce720"} Apr 17 17:01:53.254431 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.254150 2574 scope.go:117] "RemoveContainer" containerID="4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5" Apr 17 17:01:53.262430 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.262416 2574 scope.go:117] "RemoveContainer" containerID="4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5" Apr 17 17:01:53.262681 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:01:53.262640 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5\": container with ID starting with 4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5 not found: ID does not exist" containerID="4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5" Apr 17 17:01:53.262729 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.262690 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5"} err="failed to get container status \"4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5\": rpc error: code = NotFound 
desc = could not find container \"4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5\": container with ID starting with 4fce35e3d09cff5299d0ea533793345b54bc0513fd5edf33e0a41a582b5d21b5 not found: ID does not exist" Apr 17 17:01:53.271028 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.271006 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-25jhs"] Apr 17 17:01:53.274925 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:53.274905 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-25jhs"] Apr 17 17:01:55.167944 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:01:55.167915 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d070a7de-82e2-4294-9f98-cc2f79b384a2" path="/var/lib/kubelet/pods/d070a7de-82e2-4294-9f98-cc2f79b384a2/volumes" Apr 17 17:02:01.247616 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:01.247585 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:02:01.602915 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:01.602871 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:02:01.603182 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:01.603156 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" podUID="ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" containerName="limitador" containerID="cri-o://a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35" gracePeriod=30 Apr 17 17:02:02.148834 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.148812 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:02:02.288334 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.288303 2574 generic.go:358] "Generic (PLEG): container finished" podID="ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" containerID="a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35" exitCode=0 Apr 17 17:02:02.288772 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.288379 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" Apr 17 17:02:02.288772 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.288388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" event={"ID":"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a","Type":"ContainerDied","Data":"a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35"} Apr 17 17:02:02.288772 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.288428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-z949k" event={"ID":"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a","Type":"ContainerDied","Data":"0edd2960b831fc6714ac7fd3ebc93b60d38d70bfbf146c8bb76585660471ded5"} Apr 17 17:02:02.288772 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.288448 2574 scope.go:117] "RemoveContainer" containerID="a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35" Apr 17 17:02:02.297149 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.297115 2574 scope.go:117] "RemoveContainer" containerID="a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35" Apr 17 17:02:02.297415 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:02:02.297393 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35\": container with ID starting with 
a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35 not found: ID does not exist" containerID="a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35" Apr 17 17:02:02.297481 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.297428 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35"} err="failed to get container status \"a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35\": rpc error: code = NotFound desc = could not find container \"a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35\": container with ID starting with a0fdb065b5c0a47e5af5379352acf946fe10df79a99de58842b096dbee3e2a35 not found: ID does not exist" Apr 17 17:02:02.297581 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.297569 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-config-file\") pod \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " Apr 17 17:02:02.297711 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.297695 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj54s\" (UniqueName: \"kubernetes.io/projected/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-kube-api-access-zj54s\") pod \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\" (UID: \"ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a\") " Apr 17 17:02:02.297974 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.297956 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-config-file" (OuterVolumeSpecName: "config-file") pod "ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" (UID: "ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:02:02.299778 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.299759 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-kube-api-access-zj54s" (OuterVolumeSpecName: "kube-api-access-zj54s") pod "ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" (UID: "ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a"). InnerVolumeSpecName "kube-api-access-zj54s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:02:02.398616 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.398583 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zj54s\" (UniqueName: \"kubernetes.io/projected/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-kube-api-access-zj54s\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:02:02.398616 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.398611 2574 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a-config-file\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:02:02.610415 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.610389 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:02:02.618739 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:02.613868 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-z949k"] Apr 17 17:02:03.167064 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:03.167024 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" path="/var/lib/kubelet/pods/ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a/volumes" Apr 17 17:02:05.385453 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.385418 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-wdgkk"] 
Apr 17 17:02:05.385980 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.385966 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d070a7de-82e2-4294-9f98-cc2f79b384a2" containerName="authorino" Apr 17 17:02:05.386052 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.385985 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d070a7de-82e2-4294-9f98-cc2f79b384a2" containerName="authorino" Apr 17 17:02:05.386052 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.386015 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" containerName="limitador" Apr 17 17:02:05.386052 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.386026 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" containerName="limitador" Apr 17 17:02:05.386218 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.386122 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d070a7de-82e2-4294-9f98-cc2f79b384a2" containerName="authorino" Apr 17 17:02:05.386218 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.386139 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff5dec6e-8ac9-4494-b3ec-44716ca1bf5a" containerName="limitador" Apr 17 17:02:05.390702 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.390679 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.393313 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.393289 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 17:02:05.393435 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.393293 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-vwv9b\"" Apr 17 17:02:05.397675 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.397629 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wdgkk"] Apr 17 17:02:05.524189 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.524155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/87b96799-3d10-4c24-9545-b2c210461496-data\") pod \"postgres-868db5846d-wdgkk\" (UID: \"87b96799-3d10-4c24-9545-b2c210461496\") " pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.524355 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.524233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmrw\" (UniqueName: \"kubernetes.io/projected/87b96799-3d10-4c24-9545-b2c210461496-kube-api-access-pvmrw\") pod \"postgres-868db5846d-wdgkk\" (UID: \"87b96799-3d10-4c24-9545-b2c210461496\") " pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.625566 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.625531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmrw\" (UniqueName: \"kubernetes.io/projected/87b96799-3d10-4c24-9545-b2c210461496-kube-api-access-pvmrw\") pod \"postgres-868db5846d-wdgkk\" (UID: \"87b96799-3d10-4c24-9545-b2c210461496\") " pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.625773 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.625583 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/87b96799-3d10-4c24-9545-b2c210461496-data\") pod \"postgres-868db5846d-wdgkk\" (UID: \"87b96799-3d10-4c24-9545-b2c210461496\") " pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.625950 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.625932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/87b96799-3d10-4c24-9545-b2c210461496-data\") pod \"postgres-868db5846d-wdgkk\" (UID: \"87b96799-3d10-4c24-9545-b2c210461496\") " pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.634558 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.634533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmrw\" (UniqueName: \"kubernetes.io/projected/87b96799-3d10-4c24-9545-b2c210461496-kube-api-access-pvmrw\") pod \"postgres-868db5846d-wdgkk\" (UID: \"87b96799-3d10-4c24-9545-b2c210461496\") " pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.705111 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.705025 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:05.832511 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:05.832487 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-wdgkk"] Apr 17 17:02:05.834143 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:02:05.834111 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b96799_3d10_4c24_9545_b2c210461496.slice/crio-b99d743160ab71d51d595b3a668eb77c69403ff42d3a9cabb282b94aa5c36a77 WatchSource:0}: Error finding container b99d743160ab71d51d595b3a668eb77c69403ff42d3a9cabb282b94aa5c36a77: Status 404 returned error can't find the container with id b99d743160ab71d51d595b3a668eb77c69403ff42d3a9cabb282b94aa5c36a77 Apr 17 17:02:06.304381 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:06.304343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wdgkk" event={"ID":"87b96799-3d10-4c24-9545-b2c210461496","Type":"ContainerStarted","Data":"b99d743160ab71d51d595b3a668eb77c69403ff42d3a9cabb282b94aa5c36a77"} Apr 17 17:02:12.332173 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:12.332137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-wdgkk" event={"ID":"87b96799-3d10-4c24-9545-b2c210461496","Type":"ContainerStarted","Data":"f6de75e4b1f1f9f51d4bff287e3adbf29ae1c96e5f58d3330c8ea2e326b40e3c"} Apr 17 17:02:12.332518 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:12.332234 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:12.352895 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:12.352853 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-wdgkk" podStartSLOduration=1.552846824 podStartE2EDuration="7.352840374s" podCreationTimestamp="2026-04-17 17:02:05 +0000 UTC" 
firstStartedPulling="2026-04-17 17:02:05.835600596 +0000 UTC m=+565.235029111" lastFinishedPulling="2026-04-17 17:02:11.635594134 +0000 UTC m=+571.035022661" observedRunningTime="2026-04-17 17:02:12.351094856 +0000 UTC m=+571.750523391" watchObservedRunningTime="2026-04-17 17:02:12.352840374 +0000 UTC m=+571.752268905" Apr 17 17:02:18.364765 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:18.364740 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-wdgkk" Apr 17 17:02:19.264773 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.264744 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-67cb499598-nqncc"] Apr 17 17:02:19.270270 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.270252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:19.273031 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.273011 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 17:02:19.273154 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.273137 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 17:02:19.273202 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.273168 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gmxlg\"" Apr 17 17:02:19.277047 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.277025 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-67cb499598-nqncc"] Apr 17 17:02:19.450497 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.450464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls\") pod 
\"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:19.450903 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.450509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrjr4\" (UniqueName: \"kubernetes.io/projected/e075b927-0aec-4e18-a1c2-507b8ce93940-kube-api-access-lrjr4\") pod \"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:19.551697 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.551637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrjr4\" (UniqueName: \"kubernetes.io/projected/e075b927-0aec-4e18-a1c2-507b8ce93940-kube-api-access-lrjr4\") pod \"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:19.551892 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.551787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls\") pod \"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:19.551968 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:02:19.551911 2574 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 17 17:02:19.552025 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:02:19.551991 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls podName:e075b927-0aec-4e18-a1c2-507b8ce93940 nodeName:}" failed. No retries permitted until 2026-04-17 17:02:20.051969201 +0000 UTC m=+579.451397714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls") pod "maas-api-67cb499598-nqncc" (UID: "e075b927-0aec-4e18-a1c2-507b8ce93940") : secret "maas-api-serving-cert" not found Apr 17 17:02:19.562727 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:19.562702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrjr4\" (UniqueName: \"kubernetes.io/projected/e075b927-0aec-4e18-a1c2-507b8ce93940-kube-api-access-lrjr4\") pod \"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:20.032361 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.032327 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-sx8sm"] Apr 17 17:02:20.035984 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.035968 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:20.038468 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.038450 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wrrmf\"" Apr 17 17:02:20.041995 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.041972 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-sx8sm"] Apr 17 17:02:20.056933 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.056907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls\") pod \"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:20.059458 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.059430 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls\") pod \"maas-api-67cb499598-nqncc\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:20.158311 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.158280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/1ffa56e7-2287-469e-aa81-f1251789f946-kube-api-access-gqbgk\") pod \"authorino-8b475cf9f-sx8sm\" (UID: \"1ffa56e7-2287-469e-aa81-f1251789f946\") " pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:20.181437 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.181404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:20.259090 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.259060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/1ffa56e7-2287-469e-aa81-f1251789f946-kube-api-access-gqbgk\") pod \"authorino-8b475cf9f-sx8sm\" (UID: \"1ffa56e7-2287-469e-aa81-f1251789f946\") " pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:20.271036 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.271011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/1ffa56e7-2287-469e-aa81-f1251789f946-kube-api-access-gqbgk\") pod \"authorino-8b475cf9f-sx8sm\" (UID: \"1ffa56e7-2287-469e-aa81-f1251789f946\") " pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:20.277975 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.277946 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-sx8sm"] Apr 17 17:02:20.278258 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.278245 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:20.304018 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.303769 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6b54f4cbcb-m5tb6"] Apr 17 17:02:20.312137 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.312110 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:20.314602 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.314523 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6b54f4cbcb-m5tb6"] Apr 17 17:02:20.318430 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.318335 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-67cb499598-nqncc"] Apr 17 17:02:20.322557 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:02:20.322534 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode075b927_0aec_4e18_a1c2_507b8ce93940.slice/crio-746b24bb2bec4775d63f900af6ca8831e662c83949e96f40774e8ceda58ee762 WatchSource:0}: Error finding container 746b24bb2bec4775d63f900af6ca8831e662c83949e96f40774e8ceda58ee762: Status 404 returned error can't find the container with id 746b24bb2bec4775d63f900af6ca8831e662c83949e96f40774e8ceda58ee762 Apr 17 17:02:20.364891 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.364843 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-67cb499598-nqncc" event={"ID":"e075b927-0aec-4e18-a1c2-507b8ce93940","Type":"ContainerStarted","Data":"746b24bb2bec4775d63f900af6ca8831e662c83949e96f40774e8ceda58ee762"} Apr 17 17:02:20.382098 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.381797 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6b54f4cbcb-m5tb6"] Apr 17 17:02:20.382218 ip-10-0-138-47 
kubenswrapper[2574]: E0417 17:02:20.382108 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-kk4k5], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" podUID="5dfde5e4-dd7d-4795-af1d-a7c0047d1133" Apr 17 17:02:20.422962 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.422929 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-sx8sm"] Apr 17 17:02:20.424372 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:02:20.424344 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ffa56e7_2287_469e_aa81_f1251789f946.slice/crio-4315d5932fdf22c4431df248386a2bf61deebf1ff61ed40fba89cb33c6a78d67 WatchSource:0}: Error finding container 4315d5932fdf22c4431df248386a2bf61deebf1ff61ed40fba89cb33c6a78d67: Status 404 returned error can't find the container with id 4315d5932fdf22c4431df248386a2bf61deebf1ff61ed40fba89cb33c6a78d67 Apr 17 17:02:20.461617 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.461592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk4k5\" (UniqueName: \"kubernetes.io/projected/5dfde5e4-dd7d-4795-af1d-a7c0047d1133-kube-api-access-kk4k5\") pod \"authorino-6b54f4cbcb-m5tb6\" (UID: \"5dfde5e4-dd7d-4795-af1d-a7c0047d1133\") " pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:20.562755 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:20.562644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk4k5\" (UniqueName: \"kubernetes.io/projected/5dfde5e4-dd7d-4795-af1d-a7c0047d1133-kube-api-access-kk4k5\") pod \"authorino-6b54f4cbcb-m5tb6\" (UID: \"5dfde5e4-dd7d-4795-af1d-a7c0047d1133\") " pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:20.571417 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:02:20.571393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk4k5\" (UniqueName: \"kubernetes.io/projected/5dfde5e4-dd7d-4795-af1d-a7c0047d1133-kube-api-access-kk4k5\") pod \"authorino-6b54f4cbcb-m5tb6\" (UID: \"5dfde5e4-dd7d-4795-af1d-a7c0047d1133\") " pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:21.372327 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.372238 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:21.372504 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.372324 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" podUID="1ffa56e7-2287-469e-aa81-f1251789f946" containerName="authorino" containerID="cri-o://875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3" gracePeriod=30 Apr 17 17:02:21.372504 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.372371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" event={"ID":"1ffa56e7-2287-469e-aa81-f1251789f946","Type":"ContainerStarted","Data":"875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3"} Apr 17 17:02:21.372504 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.372414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" event={"ID":"1ffa56e7-2287-469e-aa81-f1251789f946","Type":"ContainerStarted","Data":"4315d5932fdf22c4431df248386a2bf61deebf1ff61ed40fba89cb33c6a78d67"} Apr 17 17:02:21.379281 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.379169 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:21.392887 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.392836 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" podStartSLOduration=0.767388512 podStartE2EDuration="1.392818637s" podCreationTimestamp="2026-04-17 17:02:20 +0000 UTC" firstStartedPulling="2026-04-17 17:02:20.426194756 +0000 UTC m=+579.825623266" lastFinishedPulling="2026-04-17 17:02:21.051624864 +0000 UTC m=+580.451053391" observedRunningTime="2026-04-17 17:02:21.389585153 +0000 UTC m=+580.789013695" watchObservedRunningTime="2026-04-17 17:02:21.392818637 +0000 UTC m=+580.792247170" Apr 17 17:02:21.570080 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.570053 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk4k5\" (UniqueName: \"kubernetes.io/projected/5dfde5e4-dd7d-4795-af1d-a7c0047d1133-kube-api-access-kk4k5\") pod \"5dfde5e4-dd7d-4795-af1d-a7c0047d1133\" (UID: \"5dfde5e4-dd7d-4795-af1d-a7c0047d1133\") " Apr 17 17:02:21.574087 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.574061 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dfde5e4-dd7d-4795-af1d-a7c0047d1133-kube-api-access-kk4k5" (OuterVolumeSpecName: "kube-api-access-kk4k5") pod "5dfde5e4-dd7d-4795-af1d-a7c0047d1133" (UID: "5dfde5e4-dd7d-4795-af1d-a7c0047d1133"). InnerVolumeSpecName "kube-api-access-kk4k5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:02:21.659964 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.659943 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:21.671666 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.671630 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kk4k5\" (UniqueName: \"kubernetes.io/projected/5dfde5e4-dd7d-4795-af1d-a7c0047d1133-kube-api-access-kk4k5\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:02:21.772366 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.772330 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/1ffa56e7-2287-469e-aa81-f1251789f946-kube-api-access-gqbgk\") pod \"1ffa56e7-2287-469e-aa81-f1251789f946\" (UID: \"1ffa56e7-2287-469e-aa81-f1251789f946\") " Apr 17 17:02:21.774326 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.774293 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ffa56e7-2287-469e-aa81-f1251789f946-kube-api-access-gqbgk" (OuterVolumeSpecName: "kube-api-access-gqbgk") pod "1ffa56e7-2287-469e-aa81-f1251789f946" (UID: "1ffa56e7-2287-469e-aa81-f1251789f946"). InnerVolumeSpecName "kube-api-access-gqbgk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:02:21.874226 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:21.874174 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqbgk\" (UniqueName: \"kubernetes.io/projected/1ffa56e7-2287-469e-aa81-f1251789f946-kube-api-access-gqbgk\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:02:22.381290 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.381221 2574 generic.go:358] "Generic (PLEG): container finished" podID="1ffa56e7-2287-469e-aa81-f1251789f946" containerID="875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3" exitCode=0 Apr 17 17:02:22.381486 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.381424 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6b54f4cbcb-m5tb6" Apr 17 17:02:22.381551 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.381498 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" event={"ID":"1ffa56e7-2287-469e-aa81-f1251789f946","Type":"ContainerDied","Data":"875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3"} Apr 17 17:02:22.381606 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.381571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" event={"ID":"1ffa56e7-2287-469e-aa81-f1251789f946","Type":"ContainerDied","Data":"4315d5932fdf22c4431df248386a2bf61deebf1ff61ed40fba89cb33c6a78d67"} Apr 17 17:02:22.381606 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.381595 2574 scope.go:117] "RemoveContainer" containerID="875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3" Apr 17 17:02:22.382882 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.381812 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-sx8sm" Apr 17 17:02:22.420544 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.420514 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6b54f4cbcb-m5tb6"] Apr 17 17:02:22.425683 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.425631 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6b54f4cbcb-m5tb6"] Apr 17 17:02:22.437054 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.437017 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-sx8sm"] Apr 17 17:02:22.439283 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:22.439258 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-sx8sm"] Apr 17 17:02:23.089516 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:23.089494 2574 scope.go:117] "RemoveContainer" containerID="875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3" Apr 17 17:02:23.089882 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:02:23.089829 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3\": container with ID starting with 875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3 not found: ID does not exist" containerID="875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3" Apr 17 17:02:23.089939 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:23.089877 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3"} err="failed to get container status \"875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3\": rpc error: code = NotFound desc = could not find container \"875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3\": container with ID 
starting with 875c289857ab2a0882409c6e7ca68291a9775862cb015d5debb76333f7335df3 not found: ID does not exist" Apr 17 17:02:23.167750 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:23.167717 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ffa56e7-2287-469e-aa81-f1251789f946" path="/var/lib/kubelet/pods/1ffa56e7-2287-469e-aa81-f1251789f946/volumes" Apr 17 17:02:23.168103 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:23.168090 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dfde5e4-dd7d-4795-af1d-a7c0047d1133" path="/var/lib/kubelet/pods/5dfde5e4-dd7d-4795-af1d-a7c0047d1133/volumes" Apr 17 17:02:24.391808 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:24.391777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-67cb499598-nqncc" event={"ID":"e075b927-0aec-4e18-a1c2-507b8ce93940","Type":"ContainerStarted","Data":"1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9"} Apr 17 17:02:24.392216 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:24.391932 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:24.407810 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:24.407761 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-67cb499598-nqncc" podStartSLOduration=2.322299178 podStartE2EDuration="5.407748235s" podCreationTimestamp="2026-04-17 17:02:19 +0000 UTC" firstStartedPulling="2026-04-17 17:02:20.323989142 +0000 UTC m=+579.723417652" lastFinishedPulling="2026-04-17 17:02:23.409438194 +0000 UTC m=+582.808866709" observedRunningTime="2026-04-17 17:02:24.406459747 +0000 UTC m=+583.805888282" watchObservedRunningTime="2026-04-17 17:02:24.407748235 +0000 UTC m=+583.807176766" Apr 17 17:02:29.517673 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.517631 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-67cb499598-nqncc"] 
Apr 17 17:02:29.518136 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.517877 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-67cb499598-nqncc" podUID="e075b927-0aec-4e18-a1c2-507b8ce93940" containerName="maas-api" containerID="cri-o://1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9" gracePeriod=30 Apr 17 17:02:29.522587 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.522566 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:29.757850 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.757820 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:29.836492 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.836464 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls\") pod \"e075b927-0aec-4e18-a1c2-507b8ce93940\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " Apr 17 17:02:29.836705 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.836521 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrjr4\" (UniqueName: \"kubernetes.io/projected/e075b927-0aec-4e18-a1c2-507b8ce93940-kube-api-access-lrjr4\") pod \"e075b927-0aec-4e18-a1c2-507b8ce93940\" (UID: \"e075b927-0aec-4e18-a1c2-507b8ce93940\") " Apr 17 17:02:29.838959 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.838929 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e075b927-0aec-4e18-a1c2-507b8ce93940-kube-api-access-lrjr4" (OuterVolumeSpecName: "kube-api-access-lrjr4") pod "e075b927-0aec-4e18-a1c2-507b8ce93940" (UID: "e075b927-0aec-4e18-a1c2-507b8ce93940"). InnerVolumeSpecName "kube-api-access-lrjr4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:02:29.838959 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.838948 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "e075b927-0aec-4e18-a1c2-507b8ce93940" (UID: "e075b927-0aec-4e18-a1c2-507b8ce93940"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:02:29.937447 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.937404 2574 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e075b927-0aec-4e18-a1c2-507b8ce93940-maas-api-tls\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:02:29.937447 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:29.937444 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrjr4\" (UniqueName: \"kubernetes.io/projected/e075b927-0aec-4e18-a1c2-507b8ce93940-kube-api-access-lrjr4\") on node \"ip-10-0-138-47.ec2.internal\" DevicePath \"\"" Apr 17 17:02:30.416101 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.416061 2574 generic.go:358] "Generic (PLEG): container finished" podID="e075b927-0aec-4e18-a1c2-507b8ce93940" containerID="1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9" exitCode=0 Apr 17 17:02:30.416248 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.416126 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-67cb499598-nqncc" Apr 17 17:02:30.416248 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.416145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-67cb499598-nqncc" event={"ID":"e075b927-0aec-4e18-a1c2-507b8ce93940","Type":"ContainerDied","Data":"1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9"} Apr 17 17:02:30.416248 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.416184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-67cb499598-nqncc" event={"ID":"e075b927-0aec-4e18-a1c2-507b8ce93940","Type":"ContainerDied","Data":"746b24bb2bec4775d63f900af6ca8831e662c83949e96f40774e8ceda58ee762"} Apr 17 17:02:30.416248 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.416204 2574 scope.go:117] "RemoveContainer" containerID="1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9" Apr 17 17:02:30.424889 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.424872 2574 scope.go:117] "RemoveContainer" containerID="1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9" Apr 17 17:02:30.425134 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:02:30.425117 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9\": container with ID starting with 1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9 not found: ID does not exist" containerID="1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9" Apr 17 17:02:30.425184 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.425142 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9"} err="failed to get container status \"1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9\": rpc error: code = NotFound desc = could 
not find container \"1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9\": container with ID starting with 1406196fbecb6bafcf25781e3ec1774a027e67af6c656516955f021ce1962da9 not found: ID does not exist" Apr 17 17:02:30.438158 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.438138 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-67cb499598-nqncc"] Apr 17 17:02:30.446392 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:30.444760 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-67cb499598-nqncc"] Apr 17 17:02:31.169485 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:31.169452 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e075b927-0aec-4e18-a1c2-507b8ce93940" path="/var/lib/kubelet/pods/e075b927-0aec-4e18-a1c2-507b8ce93940/volumes" Apr 17 17:02:41.065749 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:41.065717 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 17:02:41.066168 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:41.066075 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 17:02:52.980485 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980449 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"] Apr 17 17:02:52.982460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980814 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e075b927-0aec-4e18-a1c2-507b8ce93940" containerName="maas-api" Apr 17 17:02:52.982460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980826 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e075b927-0aec-4e18-a1c2-507b8ce93940" 
containerName="maas-api" Apr 17 17:02:52.982460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980841 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ffa56e7-2287-469e-aa81-f1251789f946" containerName="authorino" Apr 17 17:02:52.982460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980846 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ffa56e7-2287-469e-aa81-f1251789f946" containerName="authorino" Apr 17 17:02:52.982460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980901 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ffa56e7-2287-469e-aa81-f1251789f946" containerName="authorino" Apr 17 17:02:52.982460 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.980910 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e075b927-0aec-4e18-a1c2-507b8ce93940" containerName="maas-api" Apr 17 17:02:52.983773 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.983758 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:52.986235 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.986212 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 17:02:52.986235 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.986225 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-9bqcd\"" Apr 17 17:02:52.987411 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.987381 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 17:02:52.987538 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.987461 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 17:02:52.992935 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:52.992916 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"] Apr 17 17:02:53.033466 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.033445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.033585 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.033489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: 
\"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.033585 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.033543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.033585 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.033578 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.033720 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.033623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvmg\" (UniqueName: \"kubernetes.io/projected/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-kube-api-access-clvmg\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.033720 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.033679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: 
\"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134318 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134469 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134469 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134553 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134553 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:02:53.134528 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clvmg\" (UniqueName: \"kubernetes.io/projected/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-kube-api-access-clvmg\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134683 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134748 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134802 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.134941 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.134916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.136788 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.136763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.137064 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.137043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.142228 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.142211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvmg\" (UniqueName: \"kubernetes.io/projected/e32ed82c-3f19-45c3-bc61-85aa7c599cf6-kube-api-access-clvmg\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc\" (UID: \"e32ed82c-3f19-45c3-bc61-85aa7c599cf6\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.294719 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.294685 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:02:53.426223 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.426199 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"] Apr 17 17:02:53.428475 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:02:53.428435 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32ed82c_3f19_45c3_bc61_85aa7c599cf6.slice/crio-3e208530a15c18354a378932adf8857264260d35f9a4572e0bf46d473eed9d63 WatchSource:0}: Error finding container 3e208530a15c18354a378932adf8857264260d35f9a4572e0bf46d473eed9d63: Status 404 returned error can't find the container with id 3e208530a15c18354a378932adf8857264260d35f9a4572e0bf46d473eed9d63 Apr 17 17:02:53.430231 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.430215 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:02:53.500449 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:02:53.500413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerStarted","Data":"3e208530a15c18354a378932adf8857264260d35f9a4572e0bf46d473eed9d63"} Apr 17 17:03:00.528880 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:00.528846 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerStarted","Data":"892e801ecf9783119366a6e5a63bdb07e2c1c8745c527b87379a2213cc7fc383"} Apr 17 17:03:06.554773 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.554737 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" 
containerID="892e801ecf9783119366a6e5a63bdb07e2c1c8745c527b87379a2213cc7fc383" exitCode=0 Apr 17 17:03:06.555177 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.554795 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"892e801ecf9783119366a6e5a63bdb07e2c1c8745c527b87379a2213cc7fc383"} Apr 17 17:03:06.777732 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.777700 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b"] Apr 17 17:03:06.781539 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.781520 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.783973 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.783952 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 17:03:06.791735 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.791711 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b"] Apr 17 17:03:06.859540 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.859089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.859540 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.859150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.859540 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.859183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.859540 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.859231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66s67\" (UniqueName: \"kubernetes.io/projected/35ae9aa3-b5d8-4aad-955f-426d1900adeb-kube-api-access-66s67\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.859540 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.859290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35ae9aa3-b5d8-4aad-955f-426d1900adeb-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.859540 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.859333 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.960619 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.960580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35ae9aa3-b5d8-4aad-955f-426d1900adeb-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.960812 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.960643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.960812 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.960742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.960812 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.960784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.960989 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.960813 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.960989 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.960860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66s67\" (UniqueName: \"kubernetes.io/projected/35ae9aa3-b5d8-4aad-955f-426d1900adeb-kube-api-access-66s67\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.961181 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.961135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.961525 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.961492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.961778 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.961627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.963328 
ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.963296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/35ae9aa3-b5d8-4aad-955f-426d1900adeb-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.963749 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.963721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/35ae9aa3-b5d8-4aad-955f-426d1900adeb-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:06.970054 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:06.970030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66s67\" (UniqueName: \"kubernetes.io/projected/35ae9aa3-b5d8-4aad-955f-426d1900adeb-kube-api-access-66s67\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-fdg9b\" (UID: \"35ae9aa3-b5d8-4aad-955f-426d1900adeb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:07.095232 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:07.095196 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:07.323074 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:07.322995 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b"] Apr 17 17:03:07.633075 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:03:07.633042 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35ae9aa3_b5d8_4aad_955f_426d1900adeb.slice/crio-41c6f42cd308be43dc97de19d728b22eb7f37be1cb580a671b4e3a9cbffb2d47 WatchSource:0}: Error finding container 41c6f42cd308be43dc97de19d728b22eb7f37be1cb580a671b4e3a9cbffb2d47: Status 404 returned error can't find the container with id 41c6f42cd308be43dc97de19d728b22eb7f37be1cb580a671b4e3a9cbffb2d47 Apr 17 17:03:08.566636 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:08.566580 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" event={"ID":"35ae9aa3-b5d8-4aad-955f-426d1900adeb","Type":"ContainerStarted","Data":"704173a9285fdc185099febb253493a56d39dcd9475ac85048b591ebe45e3e0d"} Apr 17 17:03:08.566636 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:08.566637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" event={"ID":"35ae9aa3-b5d8-4aad-955f-426d1900adeb","Type":"ContainerStarted","Data":"41c6f42cd308be43dc97de19d728b22eb7f37be1cb580a671b4e3a9cbffb2d47"} Apr 17 17:03:08.568436 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:08.568413 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/0.log" Apr 17 17:03:08.568841 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:08.568814 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" 
containerID="016eeabd4068c726998067cc200380d9b47120dde584958e8f71cb7d892cdd0e" exitCode=2 Apr 17 17:03:08.568920 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:08.568886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"016eeabd4068c726998067cc200380d9b47120dde584958e8f71cb7d892cdd0e"} Apr 17 17:03:08.569242 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:08.569230 2574 scope.go:117] "RemoveContainer" containerID="016eeabd4068c726998067cc200380d9b47120dde584958e8f71cb7d892cdd0e" Apr 17 17:03:09.574678 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:09.574636 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/1.log" Apr 17 17:03:09.575085 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:09.575069 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/0.log" Apr 17 17:03:09.575462 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:09.575435 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" containerID="2a885698f171f97a47451550849612ef6dec8c6e2bc2fba737564f34b6a32499" exitCode=2 Apr 17 17:03:09.575583 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:09.575515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"2a885698f171f97a47451550849612ef6dec8c6e2bc2fba737564f34b6a32499"} Apr 17 17:03:09.575583 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:09.575562 2574 scope.go:117] "RemoveContainer" containerID="016eeabd4068c726998067cc200380d9b47120dde584958e8f71cb7d892cdd0e" 
Apr 17 17:03:09.576254 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:09.576168 2574 scope.go:117] "RemoveContainer" containerID="2a885698f171f97a47451550849612ef6dec8c6e2bc2fba737564f34b6a32499" Apr 17 17:03:09.576405 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:09.576384 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:03:10.582562 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:10.582537 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/1.log" Apr 17 17:03:13.295769 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:13.295741 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:03:13.296179 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:13.295781 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:03:13.296246 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:13.296199 2574 scope.go:117] "RemoveContainer" containerID="2a885698f171f97a47451550849612ef6dec8c6e2bc2fba737564f34b6a32499" Apr 17 17:03:13.296413 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:13.296389 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:03:13.595069 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:13.594989 2574 generic.go:358] "Generic (PLEG): container finished" podID="35ae9aa3-b5d8-4aad-955f-426d1900adeb" containerID="704173a9285fdc185099febb253493a56d39dcd9475ac85048b591ebe45e3e0d" exitCode=0 Apr 17 17:03:13.595206 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:13.595066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" event={"ID":"35ae9aa3-b5d8-4aad-955f-426d1900adeb","Type":"ContainerDied","Data":"704173a9285fdc185099febb253493a56d39dcd9475ac85048b591ebe45e3e0d"} Apr 17 17:03:17.614857 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:17.614820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" event={"ID":"35ae9aa3-b5d8-4aad-955f-426d1900adeb","Type":"ContainerStarted","Data":"2226b7841b09f20377f4959ebf43fae33fe70944cc9a41b12bb2ffcc2cd52954"} Apr 17 17:03:17.615341 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:17.615054 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:17.633376 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:17.633335 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" podStartSLOduration=7.985191152 podStartE2EDuration="11.633322824s" podCreationTimestamp="2026-04-17 17:03:06 +0000 UTC" firstStartedPulling="2026-04-17 17:03:13.595741649 +0000 UTC m=+632.995170160" lastFinishedPulling="2026-04-17 17:03:17.243873319 +0000 UTC m=+636.643301832" observedRunningTime="2026-04-17 17:03:17.631908673 +0000 UTC m=+637.031337219" watchObservedRunningTime="2026-04-17 17:03:17.633322824 +0000 UTC m=+637.032751400" Apr 17 17:03:25.162974 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:03:25.162942 2574 scope.go:117] "RemoveContainer" containerID="2a885698f171f97a47451550849612ef6dec8c6e2bc2fba737564f34b6a32499" Apr 17 17:03:25.645580 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:25.645552 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/2.log" Apr 17 17:03:25.645933 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:25.645915 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/1.log" Apr 17 17:03:25.646274 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:25.646255 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" containerID="2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3" exitCode=2 Apr 17 17:03:25.646350 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:25.646308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3"} Apr 17 17:03:25.646409 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:25.646360 2574 scope.go:117] "RemoveContainer" containerID="2a885698f171f97a47451550849612ef6dec8c6e2bc2fba737564f34b6a32499" Apr 17 17:03:25.646766 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:25.646749 2574 scope.go:117] "RemoveContainer" containerID="2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3" Apr 17 17:03:25.646948 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:25.646931 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main 
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:03:26.650834 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.650808 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/2.log" Apr 17 17:03:26.884816 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.884775 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"] Apr 17 17:03:26.897135 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.897100 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:26.899444 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.899420 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"] Apr 17 17:03:26.899867 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.899843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 17:03:26.932601 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.932574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8lt2\" (UniqueName: \"kubernetes.io/projected/948b4586-4164-44b7-9c00-bb3340af1142-kube-api-access-t8lt2\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:26.932776 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.932607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:26.932776 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.932639 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:26.932776 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.932690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/948b4586-4164-44b7-9c00-bb3340af1142-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:26.932776 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.932734 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:26.932912 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:26.932803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" 
(UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033234 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033193 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033401 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033401 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033277 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/948b4586-4164-44b7-9c00-bb3340af1142-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033401 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033401 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033356 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033641 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8lt2\" (UniqueName: \"kubernetes.io/projected/948b4586-4164-44b7-9c00-bb3340af1142-kube-api-access-t8lt2\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033747 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-model-cache\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033804 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.033869 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.033798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-home\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: 
\"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.035839 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.035814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/948b4586-4164-44b7-9c00-bb3340af1142-dshm\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.035988 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.035973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/948b4586-4164-44b7-9c00-bb3340af1142-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.041493 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.041471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8lt2\" (UniqueName: \"kubernetes.io/projected/948b4586-4164-44b7-9c00-bb3340af1142-kube-api-access-t8lt2\") pod \"facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk\" (UID: \"948b4586-4164-44b7-9c00-bb3340af1142\") " pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.208210 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.208127 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:27.341728 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.341696 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"] Apr 17 17:03:27.344625 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:03:27.344595 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod948b4586_4164_44b7_9c00_bb3340af1142.slice/crio-cc8f5d2a6074e6f0b6701bea5556dd9a18be39b481c2c51310a3ef995f93b9f2 WatchSource:0}: Error finding container cc8f5d2a6074e6f0b6701bea5556dd9a18be39b481c2c51310a3ef995f93b9f2: Status 404 returned error can't find the container with id cc8f5d2a6074e6f0b6701bea5556dd9a18be39b481c2c51310a3ef995f93b9f2 Apr 17 17:03:27.656800 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.656755 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerStarted","Data":"7790acf215cdbbaff7c4b9c443a8e45bcf0d5fce7e05ccbd3d0917f122a7aa9a"} Apr 17 17:03:27.656800 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:27.656800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerStarted","Data":"cc8f5d2a6074e6f0b6701bea5556dd9a18be39b481c2c51310a3ef995f93b9f2"} Apr 17 17:03:28.632180 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:28.632151 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-fdg9b" Apr 17 17:03:33.295757 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:33.295719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 
17:03:33.295757 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:33.295755 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:03:33.296216 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:33.296132 2574 scope.go:117] "RemoveContainer" containerID="2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3" Apr 17 17:03:33.296316 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:33.296300 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:03:33.680698 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:33.680584 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" containerID="7790acf215cdbbaff7c4b9c443a8e45bcf0d5fce7e05ccbd3d0917f122a7aa9a" exitCode=0 Apr 17 17:03:33.680849 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:33.680689 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"7790acf215cdbbaff7c4b9c443a8e45bcf0d5fce7e05ccbd3d0917f122a7aa9a"} Apr 17 17:03:34.686573 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:34.686544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/0.log" Apr 17 17:03:34.686988 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:34.686894 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" 
containerID="9ea61a0123467235dba0407374688096ac199aa2c64b92f190bc46dd739456ca" exitCode=2 Apr 17 17:03:34.686988 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:34.686974 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"9ea61a0123467235dba0407374688096ac199aa2c64b92f190bc46dd739456ca"} Apr 17 17:03:34.687360 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:34.687345 2574 scope.go:117] "RemoveContainer" containerID="9ea61a0123467235dba0407374688096ac199aa2c64b92f190bc46dd739456ca" Apr 17 17:03:35.692097 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:35.692071 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/1.log" Apr 17 17:03:35.692518 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:35.692468 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/0.log" Apr 17 17:03:35.692816 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:35.692795 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" containerID="b17fca58ded9e7d845d444f07bda7b1775a10c159748e6a1f06e8b75c7469843" exitCode=2 Apr 17 17:03:35.692889 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:35.692863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"b17fca58ded9e7d845d444f07bda7b1775a10c159748e6a1f06e8b75c7469843"} Apr 17 17:03:35.692945 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:35.692903 2574 scope.go:117] "RemoveContainer" containerID="9ea61a0123467235dba0407374688096ac199aa2c64b92f190bc46dd739456ca" Apr 17 17:03:35.693324 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:03:35.693308 2574 scope.go:117] "RemoveContainer" containerID="b17fca58ded9e7d845d444f07bda7b1775a10c159748e6a1f06e8b75c7469843" Apr 17 17:03:35.693537 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:35.693514 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:03:36.698067 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:36.698037 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/1.log" Apr 17 17:03:37.208819 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:37.208789 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:37.208819 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:37.208826 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:37.209261 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:37.209245 2574 scope.go:117] "RemoveContainer" containerID="b17fca58ded9e7d845d444f07bda7b1775a10c159748e6a1f06e8b75c7469843" Apr 17 17:03:37.209443 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:37.209424 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 10s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:03:45.163201 
ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:45.163163 2574 scope.go:117] "RemoveContainer" containerID="2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3" Apr 17 17:03:45.163651 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:45.163348 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:03:50.163048 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.163017 2574 scope.go:117] "RemoveContainer" containerID="b17fca58ded9e7d845d444f07bda7b1775a10c159748e6a1f06e8b75c7469843" Apr 17 17:03:50.748070 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.747992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/2.log" Apr 17 17:03:50.748413 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.748397 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/1.log" Apr 17 17:03:50.748779 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.748758 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" containerID="9dd569a3eb9343a58d68f112d2f2d776ed5b5ce348fb2bf4c039b97c6dcf26e5" exitCode=2 Apr 17 17:03:50.748852 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.748833 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"9dd569a3eb9343a58d68f112d2f2d776ed5b5ce348fb2bf4c039b97c6dcf26e5"} Apr 17 
17:03:50.748899 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.748876 2574 scope.go:117] "RemoveContainer" containerID="b17fca58ded9e7d845d444f07bda7b1775a10c159748e6a1f06e8b75c7469843" Apr 17 17:03:50.749307 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:50.749291 2574 scope.go:117] "RemoveContainer" containerID="9dd569a3eb9343a58d68f112d2f2d776ed5b5ce348fb2bf4c039b97c6dcf26e5" Apr 17 17:03:50.749533 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:03:50.749511 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:03:51.754009 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:51.753982 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/2.log" Apr 17 17:03:57.163071 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.163037 2574 scope.go:117] "RemoveContainer" containerID="2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3" Apr 17 17:03:57.208480 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.208451 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:57.208480 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.208478 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:03:57.208928 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.208911 2574 scope.go:117] "RemoveContainer" containerID="9dd569a3eb9343a58d68f112d2f2d776ed5b5ce348fb2bf4c039b97c6dcf26e5" Apr 17 17:03:57.209134 ip-10-0-138-47 kubenswrapper[2574]: 
E0417 17:03:57.209101 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:03:57.777674 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.777625 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/3.log" Apr 17 17:03:57.778089 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.778074 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/2.log" Apr 17 17:03:57.778397 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.778377 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" exitCode=2 Apr 17 17:03:57.778477 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.778452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9"} Apr 17 17:03:57.778520 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.778506 2574 scope.go:117] "RemoveContainer" containerID="2ff77990cae9fd207bb3b03f76ade71699a79fcebf7a9b8066276da4198f66d3" Apr 17 17:03:57.779042 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:57.779025 2574 scope.go:117] "RemoveContainer" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" Apr 17 17:03:57.779288 ip-10-0-138-47 
kubenswrapper[2574]: E0417 17:03:57.779267 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:03:58.783885 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:03:58.783850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/3.log" Apr 17 17:04:03.295499 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:03.295464 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:04:03.295499 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:03.295496 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" Apr 17 17:04:03.295988 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:03.295968 2574 scope.go:117] "RemoveContainer" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" Apr 17 17:04:03.296194 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:03.296176 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:04:11.166411 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.166374 2574 scope.go:117] "RemoveContainer" 
containerID="9dd569a3eb9343a58d68f112d2f2d776ed5b5ce348fb2bf4c039b97c6dcf26e5" Apr 17 17:04:11.838311 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.838285 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/3.log" Apr 17 17:04:11.838626 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.838612 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/2.log" Apr 17 17:04:11.838986 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.838964 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07" exitCode=2 Apr 17 17:04:11.839056 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.839038 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07"} Apr 17 17:04:11.839108 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.839079 2574 scope.go:117] "RemoveContainer" containerID="9dd569a3eb9343a58d68f112d2f2d776ed5b5ce348fb2bf4c039b97c6dcf26e5" Apr 17 17:04:11.839476 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:11.839459 2574 scope.go:117] "RemoveContainer" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07" Apr 17 17:04:11.839703 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:11.839682 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" 
pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:04:12.845126 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:12.845098 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/3.log" Apr 17 17:04:17.163401 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:17.163365 2574 scope.go:117] "RemoveContainer" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" Apr 17 17:04:17.163928 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:17.163604 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:04:17.209054 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:17.209020 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:04:17.209054 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:17.209055 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" Apr 17 17:04:17.209436 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:17.209422 2574 scope.go:117] "RemoveContainer" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07" Apr 17 17:04:17.209604 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:17.209589 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main 
pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:04:29.163496 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:29.163419 2574 scope.go:117] "RemoveContainer" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" Apr 17 17:04:29.163907 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:29.163606 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:04:30.162586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:30.162551 2574 scope.go:117] "RemoveContainer" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07" Apr 17 17:04:30.162801 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:30.162759 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:04:40.162587 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.162550 2574 scope.go:117] "RemoveContainer" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" Apr 17 17:04:40.960031 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.960000 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/4.log" Apr 17 17:04:40.960419 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.960402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/3.log" Apr 17 17:04:40.960706 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.960685 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9" exitCode=2 Apr 17 17:04:40.960778 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.960755 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"} Apr 17 17:04:40.960820 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.960795 2574 scope.go:117] "RemoveContainer" containerID="a354e1bb7d7f6ee806e49991ea9f422c0837ddf406e19bc838b5a3af7ba4eca9" Apr 17 17:04:40.961222 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:40.961200 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9" Apr 17 17:04:40.961434 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:40.961413 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:04:41.967067 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:41.967039 
2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/4.log"
Apr 17 17:04:43.295822 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:43.295787 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"
Apr 17 17:04:43.295822 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:43.295832 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"
Apr 17 17:04:43.296241 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:43.296213 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:04:43.296415 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:43.296397 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:04:44.163435 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:44.163408 2574 scope.go:117] "RemoveContainer" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07"
Apr 17 17:04:44.163611 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:44.163566 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:04:56.163528 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:56.163489 2574 scope.go:117] "RemoveContainer" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07"
Apr 17 17:04:56.164055 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:56.163732 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:04:56.164055 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:56.163956 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:04:57.026514 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.026484 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/4.log"
Apr 17 17:04:57.026859 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.026845 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/3.log"
Apr 17 17:04:57.027144 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.027124 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315" exitCode=2
Apr 17 17:04:57.027192 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.027164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"}
Apr 17 17:04:57.027192 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.027191 2574 scope.go:117] "RemoveContainer" containerID="376d5eb637f6c9a792a3aaa428c070b3af515e3fe9a05f6b04f840775cf5ad07"
Apr 17 17:04:57.027724 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.027704 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:04:57.027930 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:57.027912 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:04:57.208967 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.208936 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"
Apr 17 17:04:57.208967 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:57.208964 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"
Apr 17 17:04:58.031878 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:58.031850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/4.log"
Apr 17 17:04:58.032597 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:04:58.032573 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:04:58.032815 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:04:58.032798 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:05:08.163005 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:08.162971 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:05:08.163392 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:08.163150 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:05:12.162901 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:12.162866 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:05:12.163383 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:12.163080 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:05:19.163233 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:19.163197 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:05:19.163723 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:19.163445 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:05:26.162492 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:26.162462 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:05:26.162948 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:26.162685 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:05:30.163251 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:30.163214 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:05:30.163782 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:30.163467 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:05:39.163456 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:39.163417 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:05:39.163938 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:39.163695 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:05:44.162750 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:44.162719 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:05:44.163128 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:44.162905 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:05:52.162970 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:52.162934 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:05:52.163444 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:52.163179 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:05:59.163225 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:05:59.163138 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:05:59.163719 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:05:59.163332 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:06:07.163009 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:07.162964 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:06:07.163519 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:07.163206 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:06:13.162847 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:13.162810 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:06:14.306838 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:14.306809 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/5.log"
Apr 17 17:06:14.307230 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:14.307162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/4.log"
Apr 17 17:06:14.307441 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:14.307421 2574 generic.go:358] "Generic (PLEG): container finished" podID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990" exitCode=2
Apr 17 17:06:14.307522 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:14.307501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" event={"ID":"e32ed82c-3f19-45c3-bc61-85aa7c599cf6","Type":"ContainerDied","Data":"8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"}
Apr 17 17:06:14.307565 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:14.307545 2574 scope.go:117] "RemoveContainer" containerID="8db443eaeeab3663f0fd4855d6689a1dbd6ea378f90ea879f44faaa014b774a9"
Apr 17 17:06:14.307994 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:14.307977 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:06:14.308234 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:14.308214 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:06:15.312306 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:15.312271 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/5.log"
Apr 17 17:06:19.163612 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:19.163571 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:06:20.335805 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:20.335780 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:06:20.336206 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:20.336172 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/4.log"
Apr 17 17:06:20.336461 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:20.336439 2574 generic.go:358] "Generic (PLEG): container finished" podID="948b4586-4164-44b7-9c00-bb3340af1142" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384" exitCode=2
Apr 17 17:06:20.336531 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:20.336511 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" event={"ID":"948b4586-4164-44b7-9c00-bb3340af1142","Type":"ContainerDied","Data":"ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"}
Apr 17 17:06:20.336604 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:20.336591 2574 scope.go:117] "RemoveContainer" containerID="f42badfd549e3a9ed4f14872e737f7ec8ae0c2c323a379a0e606c351416bb315"
Apr 17 17:06:20.337108 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:20.337088 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:06:20.337320 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:20.337295 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:06:21.347114 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:21.347088 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:06:23.294833 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:23.294786 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"
Apr 17 17:06:23.294833 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:23.294832 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc"
Apr 17 17:06:23.295453 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:23.295282 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:06:23.295519 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:23.295470 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:06:27.208727 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:27.208690 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"
Apr 17 17:06:27.208727 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:27.208732 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk"
Apr 17 17:06:27.209232 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:27.209133 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:06:27.209333 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:27.209316 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:06:37.163579 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:37.163548 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:06:37.163999 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:37.163762 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:06:38.162875 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:38.162842 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:06:38.163046 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:38.163035 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:06:51.165721 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:51.165678 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:06:51.166235 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:51.165904 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:06:53.162747 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:06:53.162718 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:06:53.163126 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:06:53.162925 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:07:05.162524 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:05.162493 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:07:05.163144 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:05.162723 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:07:06.163441 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:06.163409 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:07:06.163868 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:06.163610 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:07:19.163235 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:19.163198 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:07:19.163770 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:19.163401 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:07:20.162737 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:20.162704 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:07:20.162959 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:20.162914 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:07:31.165769 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:31.165692 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:07:31.166191 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:31.165973 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:07:32.163292 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:32.163264 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:07:32.163474 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:32.163427 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:07:41.079075 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:41.079043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:07:41.080426 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:41.080398 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/5.log"
Apr 17 17:07:41.081398 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:41.081371 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:07:41.082514 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:41.082478 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/5.log"
Apr 17 17:07:41.093775 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:41.093756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log"
Apr 17 17:07:41.095635 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:41.095603 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log"
Apr 17 17:07:43.162754 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:43.162717 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:07:43.163170 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:43.162938 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:07:46.163152 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:46.163116 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:07:46.163563 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:46.163384 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:07:50.303692 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:50.303642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:07:50.419634 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:50.419601 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:07:50.543008 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:50.542982 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/storage-initializer/0.log"
Apr 17 17:07:56.130796 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:56.130756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6b98d9f7df-fcdvc_48fa93bb-c551-41d6-b51c-48796a50bab3/manager/0.log"
Apr 17 17:07:56.162931 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:56.162906 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:07:56.163134 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:56.163116 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:07:56.377705 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:56.377674 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wdgkk_87b96799-3d10-4c24-9545-b2c210461496/postgres/0.log"
Apr 17 17:07:57.152902 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.152873 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f_9b253b8b-4e0a-4890-a43a-c3f414b85c4f/extract/0.log"
Apr 17 17:07:57.160887 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.160867 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f_9b253b8b-4e0a-4890-a43a-c3f414b85c4f/util/0.log"
Apr 17 17:07:57.163510 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.163484 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:07:57.163756 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:07:57.163736 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:07:57.167910 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.167891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f_9b253b8b-4e0a-4890-a43a-c3f414b85c4f/pull/0.log"
Apr 17 17:07:57.279630 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.279594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr_bf2b1a44-aabd-490e-8361-7f228a0882ff/extract/0.log"
Apr 17 17:07:57.284878 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.284854 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr_bf2b1a44-aabd-490e-8361-7f228a0882ff/util/0.log"
Apr 17 17:07:57.290588 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.290570 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr_bf2b1a44-aabd-490e-8361-7f228a0882ff/pull/0.log"
Apr 17 17:07:57.405995 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.405927 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4_fa84bb84-5411-451e-82bb-4de3b078eb23/util/0.log"
Apr 17 17:07:57.413644 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.413617 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4_fa84bb84-5411-451e-82bb-4de3b078eb23/pull/0.log"
Apr 17 17:07:57.428315 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.428297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4_fa84bb84-5411-451e-82bb-4de3b078eb23/extract/0.log"
Apr 17 17:07:57.540923 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.540897 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58_81e83978-694c-40d6-b6d7-672c2b14b168/util/0.log"
Apr 17 17:07:57.547092 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.547068 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58_81e83978-694c-40d6-b6d7-672c2b14b168/pull/0.log"
Apr 17 17:07:57.552942 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.552923 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58_81e83978-694c-40d6-b6d7-672c2b14b168/extract/0.log"
Apr 17 17:07:57.784468 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.784439 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-hbw8b_af451ffc-f75b-48b4-abbf-9bd33991bcdc/manager/0.log"
Apr 17 17:07:57.897504 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:57.897457 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9ggsv_028ce1af-38c0-4b4a-b042-373af0683c54/manager/0.log"
Apr 17 17:07:58.135772 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:58.135690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-t295l_f36a0158-13e3-4493-85aa-eae8dc7f088a/registry-server/0.log"
Apr 17 17:07:58.260055 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:58.260022 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-ftzl6_e72a6c48-194c-468e-8cb9-befda0e033d7/manager/0.log"
Apr 17 17:07:59.409027 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:07:59.408985 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-55c74c7f8d-k5d4w_1224934d-3953-4719-a6a4-8ca929c1d869/router/0.log"
Apr 17 17:08:00.003013 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:00.002975 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-fdg9b_35ae9aa3-b5d8-4aad-955f-426d1900adeb/storage-initializer/0.log"
Apr 17 17:08:00.011101 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:00.011065 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-fdg9b_35ae9aa3-b5d8-4aad-955f-426d1900adeb/main/0.log"
Apr 17 17:08:00.129135 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:00.129097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/storage-initializer/0.log"
Apr 17 17:08:00.135407 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:00.135387 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_e32ed82c-3f19-45c3-bc61-85aa7c599cf6/main/5.log"
Apr 17 17:08:00.247960 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:00.247931 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/storage-initializer/0.log"
Apr 17 17:08:00.254918 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:00.254856 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_948b4586-4164-44b7-9c00-bb3340af1142/main/5.log"
Apr 17 17:08:07.343260 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:07.343215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vdlnl_7b872dd8-6b64-4209-aecb-57e4459eea02/global-pull-secret-syncer/0.log"
Apr 17 17:08:07.389917 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:07.389878 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-gpbll_1de234cc-f7b3-4b43-96ec-12c143fb5b33/konnectivity-agent/0.log"
Apr 17 17:08:07.487855 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:07.487819 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-47.ec2.internal_7c4eaf6115bb9effc38fe56b1daa6a65/haproxy/0.log"
Apr 17 17:08:08.163004 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:08.162971 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384"
Apr 17 17:08:08.163191 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:08:08.163138 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142"
Apr 17 17:08:10.163404 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:10.163366 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990"
Apr 17 17:08:10.163898 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:08:10.163576 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6"
Apr 17 17:08:11.050440 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.050345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f_9b253b8b-4e0a-4890-a43a-c3f414b85c4f/extract/0.log"
Apr 17 17:08:11.065514 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.065484 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f_9b253b8b-4e0a-4890-a43a-c3f414b85c4f/util/0.log"
Apr 17 17:08:11.083918 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.083882 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xlp6f_9b253b8b-4e0a-4890-a43a-c3f414b85c4f/pull/0.log"
Apr 17 17:08:11.105066 ip-10-0-138-47 kubenswrapper[2574]:
I0417 17:08:11.105037 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr_bf2b1a44-aabd-490e-8361-7f228a0882ff/extract/0.log" Apr 17 17:08:11.124299 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.124273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr_bf2b1a44-aabd-490e-8361-7f228a0882ff/util/0.log" Apr 17 17:08:11.142802 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.142777 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wc6sr_bf2b1a44-aabd-490e-8361-7f228a0882ff/pull/0.log" Apr 17 17:08:11.166816 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.166788 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4_fa84bb84-5411-451e-82bb-4de3b078eb23/extract/0.log" Apr 17 17:08:11.183282 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.183255 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4_fa84bb84-5411-451e-82bb-4de3b078eb23/util/0.log" Apr 17 17:08:11.210491 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.210446 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vxsd4_fa84bb84-5411-451e-82bb-4de3b078eb23/pull/0.log" Apr 17 17:08:11.231369 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.231348 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58_81e83978-694c-40d6-b6d7-672c2b14b168/extract/0.log" Apr 17 17:08:11.249416 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.249397 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58_81e83978-694c-40d6-b6d7-672c2b14b168/util/0.log" Apr 17 17:08:11.272991 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.272967 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1r9x58_81e83978-694c-40d6-b6d7-672c2b14b168/pull/0.log" Apr 17 17:08:11.329817 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.329736 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-hbw8b_af451ffc-f75b-48b4-abbf-9bd33991bcdc/manager/0.log" Apr 17 17:08:11.346464 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.346443 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9ggsv_028ce1af-38c0-4b4a-b042-373af0683c54/manager/0.log" Apr 17 17:08:11.401969 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.401940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-t295l_f36a0158-13e3-4493-85aa-eae8dc7f088a/registry-server/0.log" Apr 17 17:08:11.455626 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:11.455594 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-ftzl6_e72a6c48-194c-468e-8cb9-befda0e033d7/manager/0.log" Apr 17 17:08:12.728622 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:12.728595 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/alertmanager/0.log" Apr 17 17:08:12.756404 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:12.756366 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/config-reloader/0.log" Apr 17 17:08:12.775540 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:08:12.775518 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/kube-rbac-proxy-web/0.log" Apr 17 17:08:12.792594 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:12.792576 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/kube-rbac-proxy/0.log" Apr 17 17:08:12.810973 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:12.810951 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/kube-rbac-proxy-metric/0.log" Apr 17 17:08:12.833155 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:12.833139 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/prom-label-proxy/0.log" Apr 17 17:08:12.854344 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:12.854324 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_be9d9f7c-d9a5-40a6-b98d-398d14885410/init-config-reloader/0.log" Apr 17 17:08:13.196948 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:13.196916 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-c9b8f5d79-jmwjl_70e43a28-718b-4cd3-aa08-d04ad07a8213/metrics-server/0.log" Apr 17 17:08:13.324389 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:13.324363 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jvkrm_56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e/node-exporter/0.log" Apr 17 17:08:13.340940 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:13.340913 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jvkrm_56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e/kube-rbac-proxy/0.log" Apr 17 17:08:13.356953 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:08:13.356882 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jvkrm_56bcbda8-7c8f-49c4-9b4f-39c52a6c6a7e/init-textfile/0.log" Apr 17 17:08:13.757216 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:13.757134 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8c79fc944-t9kcw_c7336f88-2edc-4d7e-a47a-10acf9c37205/telemeter-client/0.log" Apr 17 17:08:13.779082 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:13.779050 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8c79fc944-t9kcw_c7336f88-2edc-4d7e-a47a-10acf9c37205/reload/0.log" Apr 17 17:08:13.795923 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:13.795898 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-8c79fc944-t9kcw_c7336f88-2edc-4d7e-a47a-10acf9c37205/kube-rbac-proxy/0.log" Apr 17 17:08:15.530822 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.530791 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/1.log" Apr 17 17:08:15.539376 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.539350 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ld9dq_5247795a-9811-4fad-b182-136cc56544fd/console-operator/2.log" Apr 17 17:08:15.829489 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.829409 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r"] Apr 17 17:08:15.833040 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.833025 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:15.835913 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.835893 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbgn6\"/\"kube-root-ca.crt\"" Apr 17 17:08:15.836034 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.835941 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbgn6\"/\"openshift-service-ca.crt\"" Apr 17 17:08:15.837352 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.837021 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gbgn6\"/\"default-dockercfg-s7d7q\"" Apr 17 17:08:15.839644 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.839622 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r"] Apr 17 17:08:15.916368 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.916337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-sys\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:15.916368 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.916368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-proc\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:15.916632 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.916390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6kpm4\" (UniqueName: \"kubernetes.io/projected/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-kube-api-access-6kpm4\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:15.916632 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.916486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-lib-modules\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:15.916632 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:15.916579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-podres\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.004539 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.004512 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f76f76f7f-pn9z2_94dd7853-131d-41d7-bc81-3a34351d085f/console/0.log" Apr 17 17:08:16.017381 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-sys\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017381 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" 
(UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-proc\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kpm4\" (UniqueName: \"kubernetes.io/projected/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-kube-api-access-6kpm4\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-lib-modules\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-sys\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-proc\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017586 ip-10-0-138-47 kubenswrapper[2574]: I0417 
17:08:16.017475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-podres\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017586 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017577 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-lib-modules\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.017917 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.017719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-podres\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.025755 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.025736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kpm4\" (UniqueName: \"kubernetes.io/projected/9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1-kube-api-access-6kpm4\") pod \"perf-node-gather-daemonset-hcr4r\" (UID: \"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1\") " pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.029916 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.029896 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-z564x_2b6e75b0-f8cc-4928-b513-ab3bff7a99e6/download-server/0.log" Apr 17 17:08:16.143881 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.143796 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.268873 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.268844 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r"] Apr 17 17:08:16.270493 ip-10-0-138-47 kubenswrapper[2574]: W0417 17:08:16.270463 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9b305c3a_d782_4d8c_bfa6_0a0ae9df5db1.slice/crio-66a97929e7f5c9e23246aaf08e996f7d156cdf9ccc5a3328a092e9f78d3092e5 WatchSource:0}: Error finding container 66a97929e7f5c9e23246aaf08e996f7d156cdf9ccc5a3328a092e9f78d3092e5: Status 404 returned error can't find the container with id 66a97929e7f5c9e23246aaf08e996f7d156cdf9ccc5a3328a092e9f78d3092e5 Apr 17 17:08:16.272067 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.272048 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:08:16.564209 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.564168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-d8lj4_6975d595-5cf0-46b8-9850-ebe9f9ad039f/volume-data-source-validator/0.log" Apr 17 17:08:16.781232 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.781195 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" event={"ID":"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1","Type":"ContainerStarted","Data":"300696d3b5866dc0ca34cee175f150eb35b6be4792bdd370112d2c883069d693"} Apr 17 17:08:16.781232 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.781237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" 
event={"ID":"9b305c3a-d782-4d8c-bfa6-0a0ae9df5db1","Type":"ContainerStarted","Data":"66a97929e7f5c9e23246aaf08e996f7d156cdf9ccc5a3328a092e9f78d3092e5"} Apr 17 17:08:16.781447 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.781282 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:16.798905 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:16.798861 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" podStartSLOduration=1.798848277 podStartE2EDuration="1.798848277s" podCreationTimestamp="2026-04-17 17:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:08:16.795791885 +0000 UTC m=+936.195220416" watchObservedRunningTime="2026-04-17 17:08:16.798848277 +0000 UTC m=+936.198276809" Apr 17 17:08:17.370869 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:17.370837 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5zhl2_b75234e4-2fea-42eb-9534-36f51bab38ff/dns/0.log" Apr 17 17:08:17.386482 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:17.386455 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5zhl2_b75234e4-2fea-42eb-9534-36f51bab38ff/kube-rbac-proxy/0.log" Apr 17 17:08:17.487201 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:17.487165 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fphrb_fa2701e6-325c-4b24-9bcb-827a9099143e/dns-node-resolver/0.log" Apr 17 17:08:17.932785 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:17.932749 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-86499485b7-58j9g_41155a82-d32d-42cb-8659-1373cd03f8ce/registry/0.log" Apr 17 17:08:17.946438 ip-10-0-138-47 
kubenswrapper[2574]: I0417 17:08:17.946412 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-68lhq_22ae10bb-5884-4fb4-9c4a-473df80ffa49/node-ca/0.log" Apr 17 17:08:18.840432 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:18.840395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-55c74c7f8d-k5d4w_1224934d-3953-4719-a6a4-8ca929c1d869/router/0.log" Apr 17 17:08:19.360623 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:19.360592 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-66g28_4aec4098-9bf2-45c1-beb6-dcd84bb6ca17/serve-healthcheck-canary/0.log" Apr 17 17:08:19.867013 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:19.866979 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cvvzl_c5365987-3342-41ca-b42f-381c6079bd7f/kube-rbac-proxy/0.log" Apr 17 17:08:19.883297 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:19.883265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cvvzl_c5365987-3342-41ca-b42f-381c6079bd7f/exporter/0.log" Apr 17 17:08:19.901342 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:19.901318 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cvvzl_c5365987-3342-41ca-b42f-381c6079bd7f/extractor/0.log" Apr 17 17:08:21.166220 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:21.166188 2574 scope.go:117] "RemoveContainer" containerID="8bc62c7469a2b6ce8057769bd2677fa7c745bb8397b818eb1fe661907f01a990" Apr 17 17:08:21.166749 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:08:21.166386 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main 
pod=e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc_llm(e32ed82c-3f19-45c3-bc61-85aa7c599cf6)\"" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-7dc4c8ktgcc" podUID="e32ed82c-3f19-45c3-bc61-85aa7c599cf6" Apr 17 17:08:21.809189 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:21.809155 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6b98d9f7df-fcdvc_48fa93bb-c551-41d6-b51c-48796a50bab3/manager/0.log" Apr 17 17:08:21.843669 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:21.843639 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-wdgkk_87b96799-3d10-4c24-9545-b2c210461496/postgres/0.log" Apr 17 17:08:22.163682 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:22.163580 2574 scope.go:117] "RemoveContainer" containerID="ec8930d8651430d0e6f7e7167f2ebc97859d28cecd2b61c23a007cf08707c384" Apr 17 17:08:22.163924 ip-10-0-138-47 kubenswrapper[2574]: E0417 17:08:22.163899 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"main\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=main pod=facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk_llm(948b4586-4164-44b7-9c00-bb3340af1142)\"" pod="llm/facebook-opt-125m-simulated-kserve-55b5bc47fc-vkvqk" podUID="948b4586-4164-44b7-9c00-bb3340af1142" Apr 17 17:08:22.795150 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:22.795122 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gbgn6/perf-node-gather-daemonset-hcr4r" Apr 17 17:08:22.887559 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:22.887531 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-58fcc7cb5-pqvjs_95433ad1-fccf-49b7-822b-2001e954db45/manager/0.log" Apr 17 17:08:22.909044 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:22.909001 2574 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-vs8md_62f3ffce-643f-44b0-bdc2-9782e465be26/openshift-lws-operator/0.log" Apr 17 17:08:27.080520 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:27.080489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rptsw_1f436f86-bd41-4ff1-840b-69ba06f80f04/migrator/0.log" Apr 17 17:08:27.101096 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:27.101070 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-rptsw_1f436f86-bd41-4ff1-840b-69ba06f80f04/graceful-termination/0.log" Apr 17 17:08:28.616942 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.616913 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/kube-multus-additional-cni-plugins/0.log" Apr 17 17:08:28.635190 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.635162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/egress-router-binary-copy/0.log" Apr 17 17:08:28.653122 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.653097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/cni-plugins/0.log" Apr 17 17:08:28.669526 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.669495 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/bond-cni-plugin/0.log" Apr 17 17:08:28.684857 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.684834 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/routeoverride-cni/0.log" Apr 17 17:08:28.700802 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.700781 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/whereabouts-cni-bincopy/0.log" Apr 17 17:08:28.717106 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.717090 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d47rg_a3d26988-7d2d-401c-a249-4bebb0e0a0d6/whereabouts-cni/0.log" Apr 17 17:08:28.752804 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.752776 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pk2lp_af693ee1-cbf2-4af0-9b87-70d6ad66e314/kube-multus/0.log" Apr 17 17:08:28.795861 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.795837 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7t66h_c9f04956-3cc9-4095-a965-b3737339bb37/network-metrics-daemon/0.log" Apr 17 17:08:28.809606 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:28.809587 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7t66h_c9f04956-3cc9-4095-a965-b3737339bb37/kube-rbac-proxy/0.log" Apr 17 17:08:29.633808 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.633775 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/ovn-controller/0.log" Apr 17 17:08:29.656916 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.656885 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/ovn-acl-logging/0.log" Apr 17 17:08:29.673633 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.673608 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/kube-rbac-proxy-node/0.log" Apr 17 17:08:29.688390 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.688346 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:08:29.700692 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.700648 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/northd/0.log" Apr 17 17:08:29.719188 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.719167 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/nbdb/0.log" Apr 17 17:08:29.739439 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.739418 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/sbdb/0.log" Apr 17 17:08:29.909268 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:29.909188 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9zlb2_a9562614-f113-46e2-95eb-ab53f4ee4f5d/ovnkube-controller/0.log" Apr 17 17:08:31.588710 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:31.588598 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4tw8h_ae8bba54-68da-4af3-803f-85e7bd8a4b87/check-endpoints/0.log" Apr 17 17:08:31.646295 ip-10-0-138-47 kubenswrapper[2574]: I0417 17:08:31.646263 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-pjg74_9ea89042-5289-40b6-8778-7ab4e248e54b/network-check-target-container/0.log"