Apr 20 19:20:32.678563 ip-10-0-134-118 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:20:33.291307 ip-10-0-134-118 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:20:33.291307 ip-10-0-134-118 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:20:33.291307 ip-10-0-134-118 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:20:33.291307 ip-10-0-134-118 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:20:33.291307 ip-10-0-134-118 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
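The deprecation warnings above all point at the file named by --config. As an illustration only (field values here are placeholders, not the settings actually in use on this node), the flagged parameters map onto a KubeletConfiguration roughly like this, with the eviction fields standing in for --minimum-container-ttl-duration:

```yaml
# Hypothetical fragment of the file passed via --config
# (e.g. /etc/kubernetes/kubelet.conf); values are illustrative.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:        # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:          # eviction settings replace --minimum-container-ttl-duration
  memory.available: 100Mi
  nodefs.available: 10%
```

Note that --pod-infra-container-image has no config-file equivalent; per the warning it is scheduled for removal in 1.35, with the sandbox image reported by the CRI runtime instead.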
Apr 20 19:20:33.293431 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.293327 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:20:33.299280 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299243 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:20:33.299280 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299278 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299284 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299287 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299290 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299293 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299295 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299299 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299302 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299305 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299308 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299310 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299313 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299315 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299318 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299320 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299323 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299325 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299328 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299332 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299336 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:20:33.299353 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299339 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299341 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299344 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299346 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299349 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299352 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299354 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299357 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299360 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299363 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299366 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299369 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299372 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299374 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299377 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299380 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299382 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299385 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299388 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299390 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:20:33.299842 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299393 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299396 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299399 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299401 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299403 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299406 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299408 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299411 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299413 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299415 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299418 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299421 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299423 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299425 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299429 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299432 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299435 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299437 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299440 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299443 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:20:33.300621 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299445 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299448 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299450 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299453 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299456 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299459 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299462 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299464 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299466 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299469 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299471 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299474 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299478 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299480 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299485 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299488 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299491 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299494 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299496 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:20:33.301390 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299499 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299501 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299504 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299506 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299508 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.299511 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300350 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300364 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300370 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300381 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300387 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300392 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300397 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300401 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300405 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300410 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300414 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300419 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300423 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:20:33.301852 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300428 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300432 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300436 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300445 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300450 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300454 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300459 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300464 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300468 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300472 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300477 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300480 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300485 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300489 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300493 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300502 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300515 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300520 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300526 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300531 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:20:33.302360 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300541 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300547 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300551 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300555 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300560 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300564 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300568 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300577 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300582 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300586 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300591 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300595 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300600 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300604 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300608 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300612 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300617 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300621 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300625 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300635 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:20:33.302891 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300639 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300643 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300651 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300655 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300659 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300663 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300667 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300673 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300679 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300683 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300689 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300698 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300704 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300708 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300713 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300717 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300722 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300727 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300731 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300736 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:20:33.303397 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300741 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300745 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300749 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300754 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300764 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300768 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300772 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300776 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300780 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300784 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300788 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300791 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.300801 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.300957 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.300968 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301019 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301026 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301034 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301039 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301046 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301058 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:20:33.303883 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301064 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301069 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301074 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301080 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301086 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301091 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301096 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301107 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301112 2580 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301117 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301122 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301129 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301133 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301139 2580 flags.go:64] FLAG: --config-dir=""
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301144 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301149 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301160 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301165 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301170 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301176 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301181 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301185 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301190 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301195 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301213 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:20:33.304483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301236 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301242 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301264 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301270 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301274 2580 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301279 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr
20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301287 2580 flags.go:64] FLAG: --event-burst="100" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301292 2580 flags.go:64] FLAG: --event-qps="50" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301329 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301382 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301519 2580 flags.go:64] FLAG: --eviction-hard="" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301535 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301540 2580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301544 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301550 2580 flags.go:64] FLAG: --eviction-soft="" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301553 2580 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301556 2580 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301559 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301562 2580 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301567 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301570 
2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301573 2580 flags.go:64] FLAG: --feature-gates="" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301577 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301580 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301583 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 19:20:33.305088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301587 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301590 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301593 2580 flags.go:64] FLAG: --help="false" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301596 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301600 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301603 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301606 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301611 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301615 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:20:33.305703 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:20:33.301618 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301622 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301625 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301628 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301631 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301634 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301637 2580 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301640 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301643 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301647 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301649 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301653 2580 flags.go:64] FLAG: --lock-file="" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301656 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301659 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301662 2580 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 20 19:20:33.305703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301668 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301671 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301674 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301677 2580 flags.go:64] FLAG: --logging-format="text" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301680 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301683 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301686 2580 flags.go:64] FLAG: --manifest-url="" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301689 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301694 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301698 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301702 2580 flags.go:64] FLAG: --max-pods="110" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301705 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301708 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301711 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:20:33.306328 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:20:33.301714 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301717 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301720 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301723 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301732 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301735 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301738 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301741 2580 flags.go:64] FLAG: --pod-cidr="" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301744 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:20:33.306328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301749 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301752 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301756 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301759 2580 flags.go:64] FLAG: --port="10250" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301763 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 
19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301766 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0989b65322837844b" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301769 2580 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301772 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301775 2580 flags.go:64] FLAG: --register-node="true" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301777 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301780 2580 flags.go:64] FLAG: --register-with-taints="" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301784 2580 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301787 2580 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301790 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301793 2580 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301796 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301800 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301803 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301807 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301809 2580 flags.go:64] FLAG: --runonce="false" Apr 20 
19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301812 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301816 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301820 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301823 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301825 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301828 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 19:20:33.306886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301832 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301835 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301837 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301840 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301843 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301846 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301849 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301852 2580 flags.go:64] FLAG: --system-cgroups="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: 
I0420 19:20:33.301855 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301861 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301864 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301867 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301873 2580 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301875 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301878 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301881 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301884 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301887 2580 flags.go:64] FLAG: --v="2" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301892 2580 flags.go:64] FLAG: --version="false" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301896 2580 flags.go:64] FLAG: --vmodule="" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301900 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.301903 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.301999 2580 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302003 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:33.307550 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302006 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302009 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302012 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302015 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302018 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302022 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302024 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302027 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302029 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302032 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302034 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 
19:20:33.302037 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302039 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302042 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302044 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302047 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302050 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302052 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302055 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302058 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:33.308137 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302060 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302063 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302065 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302068 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:33.308701 
ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302071 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302073 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302076 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302079 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302081 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302084 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302086 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302088 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302091 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302093 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302096 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302099 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302105 2580 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302108 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302112 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:20:33.308701 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302116 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302119 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302122 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302124 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302127 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302129 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302132 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302134 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302138 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302141 2580 feature_gate.go:328] 
unrecognized feature gate: AlibabaPlatform Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302144 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302146 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302149 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302152 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302154 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302157 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302159 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302162 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302165 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302167 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:33.309159 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302170 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302172 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302175 2580 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302177 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302179 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302182 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302185 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302189 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302192 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302196 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302199 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302202 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302204 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302207 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302209 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 
19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302212 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302215 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302217 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302220 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302222 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:20:33.309668 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302226 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302228 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302231 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302233 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.302236 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.302263 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.309471 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.309490 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309547 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309552 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309556 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309560 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309563 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309565 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309568 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:20:33.310175 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309571 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309573 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309576 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309578 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309582 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309586 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309589 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309592 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309595 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309598 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309600 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309604 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309607 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309610 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309612 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309615 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309618 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309620 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309623 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:20:33.310564 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309625 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309628 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309631 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309633 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309637 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309641 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309644 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309647 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309649 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309652 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309656 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309658 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309661 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309663 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309666 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309668 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309672 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309674 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309677 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309680 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:20:33.311040 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309682 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309685 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309688 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309690 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309693 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309695 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309698 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309700 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309703 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309705 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309708 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309710 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309713 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309716 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309718 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309721 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309724 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309727 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309730 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309733 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:20:33.311676 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309736 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309738 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309741 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309743 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309747 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309751 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309754 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309757 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309760 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309762 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309765 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309767 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309770 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309772 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309775 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309778 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309780 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309783 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309786 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:20:33.312199 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309788 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.309793 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309909 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309914 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309917 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309920 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309923 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309926 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309928 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309931 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309933 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309936 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309939 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309942 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309944 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309947 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:20:33.312689 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309949 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309952 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309954 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309957 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309959 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309962 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309964 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309967 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309969 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309971 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309974 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309977 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309979 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309981 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309984 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309986 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.309989 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310013 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310017 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310020 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:20:33.313108 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310023 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310026 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310029 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310032 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310034 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310037 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310040 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310043 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310046 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310049 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310052 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310055 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310057 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310060 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310062 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310065 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310067 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310069 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310073 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310077 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:20:33.313628 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310080 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310083 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310087 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310090 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310093 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310096 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310098 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310101 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310103 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310106 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310108 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310111 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310113 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310116 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310118 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310121 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310123 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310126 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310129 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:20:33.314109 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310132 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310134 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310137 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310140 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310143 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310145 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310147 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310150 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310152 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310155 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310157 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310160 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:33.310162 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.310167 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.310832 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:20:33.314598 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.313586 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:20:33.314968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.314472 2580 server.go:1019] "Starting client certificate rotation"
Apr 20 19:20:33.314968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.314572 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:20:33.314968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.314613 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:20:33.340646 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.340619 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:20:33.348988 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.348960 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:20:33.375305 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.375281 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:20:33.383152 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.383127 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:20:33.385521 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.385502 2580 log.go:25] "Validated CRI v1 image API"
Apr 20 19:20:33.388968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.388946 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:20:33.391755 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.391731 2580 fs.go:135] Filesystem UUIDs: map[2aaae6fc-5241-45f7-9c9b-ecad5c4ff1a7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 911845be-7f30-40cc-8f7f-e3dd27a381c8:/dev/nvme0n1p3]
Apr 20 19:20:33.391842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.391754 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:20:33.397115 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.396996 2580 manager.go:217] Machine: {Timestamp:2026-04-20 19:20:33.395583641 +0000 UTC m=+0.551133451 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100044 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21768b24bb7ce3d3b6b173b971d6a8 SystemUUID:ec21768b-24bb-7ce3-d3b6-b173b971d6a8 BootID:fd333773-069b-4197-b773-c424d96d5f1d Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b0:8d:41:56:87 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b0:8d:41:56:87 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:e7:73:c5:78:e7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:20:33.397913 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.397902 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:20:33.398009 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.397996 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:20:33.400290 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.400264 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:20:33.400434 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.400291 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-118.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 19:20:33.400476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.400445 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 19:20:33.400476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.400453 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 19:20:33.400476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.400468 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 19:20:33.401982 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.401970 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 19:20:33.404131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.404121 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 20 19:20:33.404263 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.404237 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 19:20:33.407189 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.407177 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 20 19:20:33.407232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.407201 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 19:20:33.407232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.407214 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 19:20:33.407232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.407228 2580 kubelet.go:397] "Adding apiserver pod source" Apr 20 19:20:33.407360 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.407242 2580 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 19:20:33.408530 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.408518 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:20:33.408570 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.408539 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:20:33.412358 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.412336 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 19:20:33.414464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.414451 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 19:20:33.416434 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416422 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416439 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416446 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416452 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416457 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416463 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416469 2580 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416474 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416482 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 19:20:33.416487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416488 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 19:20:33.416740 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416497 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 19:20:33.416740 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416517 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 19:20:33.416740 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416547 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 19:20:33.416740 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.416552 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 19:20:33.420354 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.420342 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 19:20:33.420391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.420379 2580 server.go:1295] "Started kubelet" Apr 20 19:20:33.420484 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.420445 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 19:20:33.420591 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.420477 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 19:20:33.420591 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.420548 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 19:20:33.421393 ip-10-0-134-118 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 19:20:33.421653 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.421640 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 20 19:20:33.424991 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.424300 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 19:20:33.424991 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.424394 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-118.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 19:20:33.424991 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.424585 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 19:20:33.424991 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.424783 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 19:20:33.440305 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.438701 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-118.ec2.internal.18a826e897e5e4fb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-118.ec2.internal,UID:ip-10-0-134-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-118.ec2.internal,},FirstTimestamp:2026-04-20 19:20:33.420354811 +0000 UTC m=+0.575904622,LastTimestamp:2026-04-20 19:20:33.420354811 +0000 UTC m=+0.575904622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-118.ec2.internal,}" Apr 20 19:20:33.444090 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.444071 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 19:20:33.445725 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.445704 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 19:20:33.446741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.446725 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 19:20:33.447455 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447438 2580 factory.go:55] Registering systemd factory Apr 20 19:20:33.447455 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447457 2580 factory.go:223] Registration of the systemd container factory successfully Apr 20 19:20:33.447781 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447756 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 19:20:33.447781 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447761 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 19:20:33.447928 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447788 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 19:20:33.447928 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:20:33.447800 2580 factory.go:153] Registering CRI-O factory Apr 20 19:20:33.447928 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447813 2580 factory.go:223] Registration of the crio container factory successfully Apr 20 19:20:33.447928 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447857 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 19:20:33.447928 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447875 2580 factory.go:103] Registering Raw factory Apr 20 19:20:33.447928 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447890 2580 manager.go:1196] Started watching for new ooms in manager Apr 20 19:20:33.448181 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447876 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 20 19:20:33.448181 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.447968 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 19:20:33.448181 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.447979 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 20 19:20:33.448353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.448238 2580 manager.go:319] Starting recovery of all containers Apr 20 19:20:33.449867 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.449848 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8d8zl" Apr 20 19:20:33.454959 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.454913 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 20 19:20:33.457368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.457352 2580 manager.go:324] Recovery completed Apr 20 19:20:33.458908 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.458878 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 19:20:33.459051 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.459034 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8d8zl" Apr 20 19:20:33.459108 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.459047 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 19:20:33.462946 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.462932 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:33.465584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.465570 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:33.465658 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.465603 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:33.465658 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.465618 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:33.466131 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.466116 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 19:20:33.466131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.466128 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 19:20:33.466233 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.466144 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 20 19:20:33.469651 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.469633 2580 policy_none.go:49] "None policy: Start" Apr 20 19:20:33.469651 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.469649 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 19:20:33.469805 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.469660 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 20 19:20:33.509817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.509799 2580 manager.go:341] "Starting Device Plugin manager" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.509830 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.509840 2580 server.go:85] "Starting device plugin registration server" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.510052 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.510064 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.510161 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.510244 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin 
watcher) starts" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.510311 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.510683 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 19:20:33.539944 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.510723 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 19:20:33.547503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.547451 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 19:20:33.547503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.547481 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 19:20:33.547503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.547496 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 19:20:33.547503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.547503 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:20:33.547710 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.547563 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:20:33.550355 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.550333 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:33.610740 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.610719 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:33.611590 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.611575 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:33.611656 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.611606 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:33.611656 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.611624 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:33.611656 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.611647 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.619452 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.619438 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.619494 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.619459 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-118.ec2.internal\": node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 
19:20:33.647124 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.647094 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 19:20:33.648229 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.648211 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal"] Apr 20 19:20:33.648302 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.648291 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:33.649585 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.649571 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:33.649695 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.649604 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:33.649695 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.649618 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:33.650910 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.650894 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:33.651033 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651019 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.651080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651049 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:33.651926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651909 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:33.651993 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651925 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:33.651993 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651937 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:33.651993 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651947 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:33.652109 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.651948 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:33.652109 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.652027 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:33.653401 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.653386 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.653482 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.653410 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:33.654153 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.654138 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:33.654217 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.654191 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:33.654217 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.654201 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:33.672992 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.672958 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-118.ec2.internal\" not found" node="ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.677386 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.677366 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-118.ec2.internal\" not found" node="ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.747213 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.747176 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 19:20:33.847681 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.847592 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 19:20:33.848809 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.848789 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3fca86c6de3ee30287aabb22e368c4de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal\" (UID: \"3fca86c6de3ee30287aabb22e368c4de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.848898 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.848824 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fca86c6de3ee30287aabb22e368c4de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal\" (UID: \"3fca86c6de3ee30287aabb22e368c4de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.848898 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.848852 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/eb80aa0e1c4282d53926185e2c1c5fe9-config\") pod \"kube-apiserver-proxy-ip-10-0-134-118.ec2.internal\" (UID: \"eb80aa0e1c4282d53926185e2c1c5fe9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal" Apr 20 19:20:33.948614 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:33.948580 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found" Apr 20 19:20:33.949761 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.949737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3fca86c6de3ee30287aabb22e368c4de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal\" (UID: \"3fca86c6de3ee30287aabb22e368c4de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" Apr 20 
19:20:33.949849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.949763 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3fca86c6de3ee30287aabb22e368c4de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal\" (UID: \"3fca86c6de3ee30287aabb22e368c4de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:33.949849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.949823 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fca86c6de3ee30287aabb22e368c4de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal\" (UID: \"3fca86c6de3ee30287aabb22e368c4de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:33.949849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.949841 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/eb80aa0e1c4282d53926185e2c1c5fe9-config\") pod \"kube-apiserver-proxy-ip-10-0-134-118.ec2.internal\" (UID: \"eb80aa0e1c4282d53926185e2c1c5fe9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:33.949974 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.949880 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/eb80aa0e1c4282d53926185e2c1c5fe9-config\") pod \"kube-apiserver-proxy-ip-10-0-134-118.ec2.internal\" (UID: \"eb80aa0e1c4282d53926185e2c1c5fe9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:33.949974 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.949890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fca86c6de3ee30287aabb22e368c4de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal\" (UID: \"3fca86c6de3ee30287aabb22e368c4de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:33.975838 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.975817 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:33.979508 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:33.979487 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:34.049489 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:34.049447 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found"
Apr 20 19:20:34.149994 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:34.149916 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found"
Apr 20 19:20:34.250512 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:34.250476 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found"
Apr 20 19:20:34.314873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.314838 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 19:20:34.315530 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.315009 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 19:20:34.351609 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:34.351421 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found"
Apr 20 19:20:34.446381 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.446296 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 19:20:34.452550 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:34.452529 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-118.ec2.internal\" not found"
Apr 20 19:20:34.460938 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.460912 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:15:33 +0000 UTC" deadline="2027-09-27 12:15:39.549187454 +0000 UTC"
Apr 20 19:20:34.460938 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.460937 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12592h55m5.088252964s"
Apr 20 19:20:34.461398 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.461386 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:20:34.473372 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.473352 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:34.502095 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.502070 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sjrgw"
Apr 20 19:20:34.502627 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.502611 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:34.517784 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.517763 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sjrgw"
Apr 20 19:20:34.547984 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.547949 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:34.564023 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.563990 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:20:34.564980 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.564963 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal"
Apr 20 19:20:34.574969 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.574952 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:20:34.635560 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:34.635525 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb80aa0e1c4282d53926185e2c1c5fe9.slice/crio-0cadfa89fd595ef0bce0186fa758a1541f51d4617a2d2e1c47f438dc234239b7 WatchSource:0}: Error finding container 0cadfa89fd595ef0bce0186fa758a1541f51d4617a2d2e1c47f438dc234239b7: Status 404 returned error can't find the container with id 0cadfa89fd595ef0bce0186fa758a1541f51d4617a2d2e1c47f438dc234239b7
Apr 20 19:20:34.636100 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:34.636083 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fca86c6de3ee30287aabb22e368c4de.slice/crio-0a8f94c128d9c03a281d4a1d40743d2d22b3ff5bc74cb17bcc42760e04d0223a WatchSource:0}: Error finding container 0a8f94c128d9c03a281d4a1d40743d2d22b3ff5bc74cb17bcc42760e04d0223a: Status 404 returned error can't find the container with id 0a8f94c128d9c03a281d4a1d40743d2d22b3ff5bc74cb17bcc42760e04d0223a
Apr 20 19:20:34.640725 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.640701 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:20:34.753357 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:34.753239 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:35.268626 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.268598 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:35.408690 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.408654 2580 apiserver.go:52] "Watching apiserver"
Apr 20 19:20:35.417198 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.417164 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 19:20:35.419222 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.419122 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal","openshift-multus/network-metrics-daemon-mw5qh","openshift-network-diagnostics/network-check-target-zldgh","openshift-network-operator/iptables-alerter-lnx79","kube-system/konnectivity-agent-z6jmk","kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn","openshift-cluster-node-tuning-operator/tuned-4g46x","openshift-multus/multus-additional-cni-plugins-l87ws","openshift-multus/multus-bcggv","openshift-ovn-kubernetes/ovnkube-node-d9tnf","openshift-image-registry/node-ca-fkcml"]
Apr 20 19:20:35.422600 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.422221 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:20:35.422600 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.422334 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987"
Apr 20 19:20:35.424706 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.424588 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:20:35.424706 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.424650 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1"
Apr 20 19:20:35.426766 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.426744 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lnx79"
Apr 20 19:20:35.430157 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.429008 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:20:35.430157 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.429317 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 19:20:35.430157 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.429518 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 19:20:35.430157 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.429717 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w4wnq\""
Apr 20 19:20:35.431243 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.431223 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z6jmk"
Apr 20 19:20:35.433956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.433465 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d95rz\""
Apr 20 19:20:35.433956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.433544 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.433956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.433465 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 19:20:35.434150 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.434054 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 19:20:35.436330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.435422 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 19:20:35.436330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.435729 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 19:20:35.436330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.435901 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7lzxz\""
Apr 20 19:20:35.436330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.436080 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 19:20:35.437808 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.437244 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.439641 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.439617 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6gmb5\""
Apr 20 19:20:35.439870 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.439829 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:20:35.440099 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.440061 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 19:20:35.441684 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.441628 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.441761 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.441709 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.443815 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.443772 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 19:20:35.443967 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.443944 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 19:20:35.444094 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444078 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kmwq7\""
Apr 20 19:20:35.444180 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444166 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 19:20:35.444637 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444296 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 19:20:35.444637 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444429 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 19:20:35.444637 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444494 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wgwwp\""
Apr 20 19:20:35.444836 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444642 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.444836 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.444700 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 19:20:35.446959 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.446938 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448066 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448187 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s26cl\""
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448375 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448456 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448634 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448653 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 19:20:35.448760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.448662 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 19:20:35.449215 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.449185 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 19:20:35.449332 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.449304 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 19:20:35.449451 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.449432 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-hblxf\""
Apr 20 19:20:35.449526 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.449509 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 19:20:35.449758 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.449740 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 19:20:35.456077 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456049 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjgf7\" (UniqueName: \"kubernetes.io/projected/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-kube-api-access-vjgf7\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.456202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456091 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-cni-binary-copy\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.456202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456117 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.456202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456138 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-run\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-hostroot\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-slash\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456235 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-system-cni-dir\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456291 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456314 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-cni-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-netns\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456362 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456410 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-node-log\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456450 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmtzx\" (UniqueName: \"kubernetes.io/projected/58118261-be4f-4f34-96ae-d918e3128ec4-kube-api-access-vmtzx\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.456478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/297f9c02-f4d5-419e-916b-2590d0104a7b-host-slash\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-modprobe-d\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456544 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysctl-conf\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456564 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-sys\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456584 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-host\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456605 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-tuned\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-daemon-config\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456653 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysctl-d\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456676 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbbxz\" (UniqueName: \"kubernetes.io/projected/9411047e-7c19-48c0-8520-4e2ff4ee80ec-kube-api-access-fbbxz\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-os-release\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cb98fcc4-ee47-45ed-bae3-05703748d0df-cni-binary-copy\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456778 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-cni-bin\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456810 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-etc-kubernetes\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-conf-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-systemd\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456929 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9411047e-7c19-48c0-8520-4e2ff4ee80ec-tmp\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.456958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456952 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-run-netns\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.456983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-log-socket\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457008 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfpwj\" (UniqueName: \"kubernetes.io/projected/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-kube-api-access-qfpwj\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457031 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx2kx\" (UniqueName: \"kubernetes.io/projected/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-kube-api-access-zx2kx\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-var-lib-kubelet\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457078 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-multus-certs\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovnkube-config\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457153 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43ca0838-f833-4608-bf4f-d6f498c3c609-host\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457177 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-systemd-units\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457201 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-ovn\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-socket-dir-parent\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457266 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43ca0838-f833-4608-bf4f-d6f498c3c609-serviceca\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457293 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjnp\" (UniqueName: \"kubernetes.io/projected/43ca0838-f833-4608-bf4f-d6f498c3c609-kube-api-access-khjnp\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457341 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-cnibin\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.457754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457364 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f817bf27-7a92-45a2-acd6-3f63fdbb765d-konnectivity-ca\") pod \"konnectivity-agent-z6jmk\" (UID: \"f817bf27-7a92-45a2-acd6-3f63fdbb765d\") " pod="kube-system/konnectivity-agent-z6jmk"
Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457400 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-lib-modules\") pod \"tuned-4g46x\" (UID:
\"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-system-cni-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457446 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-etc-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457470 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-cni-bin\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457495 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-device-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457518 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-kubelet\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457542 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovn-node-metrics-cert\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-os-release\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457588 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysconfig\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/297f9c02-f4d5-419e-916b-2590d0104a7b-iptables-alerter-script\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79" Apr 20 19:20:35.458491 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:20:35.457637 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-kubernetes\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-kubelet\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457686 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-registration-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457720 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-sys-fs\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod 
\"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:35.458491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457776 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpn9\" (UniqueName: \"kubernetes.io/projected/297f9c02-f4d5-419e-916b-2590d0104a7b-kube-api-access-bvpn9\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457811 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-systemd\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457838 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxv4f\" (UniqueName: \"kubernetes.io/projected/cb98fcc4-ee47-45ed-bae3-05703748d0df-kube-api-access-cxv4f\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457863 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-run-ovn-kubernetes\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457887 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-cnibin\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-cni-multus\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-var-lib-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457967 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-cni-netd\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.457994 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovnkube-script-lib\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.459193 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.458023 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.458057 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-k8s-cni-cncf-io\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.458082 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-env-overrides\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.458106 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f817bf27-7a92-45a2-acd6-3f63fdbb765d-agent-certs\") pod \"konnectivity-agent-z6jmk\" (UID: \"f817bf27-7a92-45a2-acd6-3f63fdbb765d\") " pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:35.459193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.458131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-socket-dir\") 
pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.519392 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.519324 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:15:34 +0000 UTC" deadline="2027-10-03 13:38:51.520767151 +0000 UTC" Apr 20 19:20:35.519392 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.519356 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12738h18m16.001414558s" Apr 20 19:20:35.553419 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.553359 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal" event={"ID":"eb80aa0e1c4282d53926185e2c1c5fe9","Type":"ContainerStarted","Data":"0cadfa89fd595ef0bce0186fa758a1541f51d4617a2d2e1c47f438dc234239b7"} Apr 20 19:20:35.554718 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.554692 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" event={"ID":"3fca86c6de3ee30287aabb22e368c4de","Type":"ContainerStarted","Data":"0a8f94c128d9c03a281d4a1d40743d2d22b3ff5bc74cb17bcc42760e04d0223a"} Apr 20 19:20:35.558717 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-systemd-units\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.558837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558732 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-ovn\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.558837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558759 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-socket-dir-parent\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.558837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43ca0838-f833-4608-bf4f-d6f498c3c609-serviceca\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml" Apr 20 19:20:35.558837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558813 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-ovn\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.558837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-systemd-units\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.559089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-socket-dir-parent\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.559089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558904 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khjnp\" (UniqueName: \"kubernetes.io/projected/43ca0838-f833-4608-bf4f-d6f498c3c609-kube-api-access-khjnp\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml" Apr 20 19:20:35.559089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:35.559089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.558967 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-cnibin\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.559089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559029 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-cnibin\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559141 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f817bf27-7a92-45a2-acd6-3f63fdbb765d-konnectivity-ca\") pod \"konnectivity-agent-z6jmk\" (UID: \"f817bf27-7a92-45a2-acd6-3f63fdbb765d\") " pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-lib-modules\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559208 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-system-cni-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-etc-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-cni-bin\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559300 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-device-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-etc-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559324 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-kubelet\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.559353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-system-cni-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovn-node-metrics-cert\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559389 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-os-release\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-lib-modules\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559426 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysconfig\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559455 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/297f9c02-f4d5-419e-916b-2590d0104a7b-iptables-alerter-script\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559467 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-os-release\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.559715 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-kubernetes\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-kubelet\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-registration-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-sys-fs\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " 
pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559573 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpn9\" (UniqueName: \"kubernetes.io/projected/297f9c02-f4d5-419e-916b-2590d0104a7b-kube-api-access-bvpn9\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79"
Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-kubernetes\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559589 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-systemd\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559612 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxv4f\" (UniqueName: \"kubernetes.io/projected/cb98fcc4-ee47-45ed-bae3-05703748d0df-kube-api-access-cxv4f\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-run-ovn-kubernetes\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.559715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-cnibin\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559716 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-cni-multus\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559719 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f817bf27-7a92-45a2-acd6-3f63fdbb765d-konnectivity-ca\") pod \"konnectivity-agent-z6jmk\" (UID: \"f817bf27-7a92-45a2-acd6-3f63fdbb765d\") " pod="kube-system/konnectivity-agent-z6jmk"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-sys-fs\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559741 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-var-lib-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-device-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-cni-netd\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559363 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-cni-bin\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-kubelet\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovnkube-script-lib\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-kubelet\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559830 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559846 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559857 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-k8s-cni-cncf-io\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.559862 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559865 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-cni-netd\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559515 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysconfig\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-env-overrides\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.560554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559904 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f817bf27-7a92-45a2-acd6-3f63fdbb765d-agent-certs\") pod \"konnectivity-agent-z6jmk\" (UID: \"f817bf27-7a92-45a2-acd6-3f63fdbb765d\") " pod="kube-system/konnectivity-agent-z6jmk"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559860 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/43ca0838-f833-4608-bf4f-d6f498c3c609-serviceca\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559916 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-run-ovn-kubernetes\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.559777 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-var-lib-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.559949 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:36.059904901 +0000 UTC m=+3.215454718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560109 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-systemd\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560155 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-k8s-cni-cncf-io\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560206 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-registration-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560300 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-cnibin\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560351 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-env-overrides\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/297f9c02-f4d5-419e-916b-2590d0104a7b-iptables-alerter-script\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560647 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-cni-multus\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-socket-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjgf7\" (UniqueName: \"kubernetes.io/projected/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-kube-api-access-vjgf7\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560770 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-cni-binary-copy\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.561368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovnkube-script-lib\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560827 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-socket-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-run\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-run\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560868 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-hostroot\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560897 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-slash\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560901 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-hostroot\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560948 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-system-cni-dir\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560998 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-cni-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.560997 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-slash\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-netns\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561052 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-netns\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561080 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561098 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-node-log\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.562079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561134 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmtzx\" (UniqueName: \"kubernetes.io/projected/58118261-be4f-4f34-96ae-d918e3128ec4-kube-api-access-vmtzx\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561161 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/297f9c02-f4d5-419e-916b-2590d0104a7b-host-slash\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-modprobe-d\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysctl-conf\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-cni-binary-copy\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-sys\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561136 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58118261-be4f-4f34-96ae-d918e3128ec4-system-cni-dir\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-cni-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-sys\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561333 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-openvswitch\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-host\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561391 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-tuned\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561418 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-daemon-config\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561426 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-modprobe-d\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561447 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysctl-d\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561463 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbbxz\" (UniqueName: \"kubernetes.io/projected/9411047e-7c19-48c0-8520-4e2ff4ee80ec-kube-api-access-fbbxz\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561478 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-os-release\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.562817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561494 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cb98fcc4-ee47-45ed-bae3-05703748d0df-cni-binary-copy\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561510 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-cni-bin\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561525 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-etc-kubernetes\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-conf-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-systemd\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561570 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561572 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-host\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561588 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9411047e-7c19-48c0-8520-4e2ff4ee80ec-tmp\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561605 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-run-netns\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561628 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-log-socket\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfpwj\" (UniqueName: \"kubernetes.io/projected/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-kube-api-access-qfpwj\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysctl-conf\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx2kx\" (UniqueName: \"kubernetes.io/projected/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-kube-api-access-zx2kx\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-var-lib-kubelet\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561733 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-multus-certs\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561759 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561764 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-conf-dir\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561787 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovnkube-config\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.571968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561812 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43ca0838-f833-4608-bf4f-d6f498c3c609-host\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-sysctl-d\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561901 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/43ca0838-f833-4608-bf4f-d6f498c3c609-host\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561947 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-run-netns\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561958 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-os-release\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-node-log\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561960 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58118261-be4f-4f34-96ae-d918e3128ec4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.561975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-log-socket\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf"
Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562004 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-run-systemd\") pod \"ovnkube-node-d9tnf\" (UID:
\"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562009 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-run-multus-certs\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562015 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-etc-kubernetes\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562017 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cb98fcc4-ee47-45ed-bae3-05703748d0df-multus-daemon-config\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562026 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb98fcc4-ee47-45ed-bae3-05703748d0df-host-var-lib-cni-bin\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562068 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d9tnf\" (UID: 
\"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/297f9c02-f4d5-419e-916b-2590d0104a7b-host-slash\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562131 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-etc-selinux\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562328 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9411047e-7c19-48c0-8520-4e2ff4ee80ec-var-lib-kubelet\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562460 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cb98fcc4-ee47-45ed-bae3-05703748d0df-cni-binary-copy\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.572926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.562859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovnkube-config\") pod \"ovnkube-node-d9tnf\" (UID: 
\"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.564068 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-ovn-node-metrics-cert\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.564152 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f817bf27-7a92-45a2-acd6-3f63fdbb765d-agent-certs\") pod \"konnectivity-agent-z6jmk\" (UID: \"f817bf27-7a92-45a2-acd6-3f63fdbb765d\") " pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.564949 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9411047e-7c19-48c0-8520-4e2ff4ee80ec-etc-tuned\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.565936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9411047e-7c19-48c0-8520-4e2ff4ee80ec-tmp\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.573346 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.573365 2580 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.573379 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:35.573695 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:35.573441 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:36.073421633 +0000 UTC m=+3.228971447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:35.574536 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.574438 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbbxz\" (UniqueName: \"kubernetes.io/projected/9411047e-7c19-48c0-8520-4e2ff4ee80ec-kube-api-access-fbbxz\") pod \"tuned-4g46x\" (UID: \"9411047e-7c19-48c0-8520-4e2ff4ee80ec\") " pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.575614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.575587 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjnp\" (UniqueName: 
\"kubernetes.io/projected/43ca0838-f833-4608-bf4f-d6f498c3c609-kube-api-access-khjnp\") pod \"node-ca-fkcml\" (UID: \"43ca0838-f833-4608-bf4f-d6f498c3c609\") " pod="openshift-image-registry/node-ca-fkcml" Apr 20 19:20:35.575710 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.575629 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxv4f\" (UniqueName: \"kubernetes.io/projected/cb98fcc4-ee47-45ed-bae3-05703748d0df-kube-api-access-cxv4f\") pod \"multus-bcggv\" (UID: \"cb98fcc4-ee47-45ed-bae3-05703748d0df\") " pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.576565 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.576536 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjgf7\" (UniqueName: \"kubernetes.io/projected/e4b6d1e4-212d-4b20-b151-bd58fb0830d1-kube-api-access-vjgf7\") pod \"aws-ebs-csi-driver-node-fpxmn\" (UID: \"e4b6d1e4-212d-4b20-b151-bd58fb0830d1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.576565 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.576556 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpn9\" (UniqueName: \"kubernetes.io/projected/297f9c02-f4d5-419e-916b-2590d0104a7b-kube-api-access-bvpn9\") pod \"iptables-alerter-lnx79\" (UID: \"297f9c02-f4d5-419e-916b-2590d0104a7b\") " pod="openshift-network-operator/iptables-alerter-lnx79" Apr 20 19:20:35.578268 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.578226 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx2kx\" (UniqueName: \"kubernetes.io/projected/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-kube-api-access-zx2kx\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:35.578699 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.578614 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vmtzx\" (UniqueName: \"kubernetes.io/projected/58118261-be4f-4f34-96ae-d918e3128ec4-kube-api-access-vmtzx\") pod \"multus-additional-cni-plugins-l87ws\" (UID: \"58118261-be4f-4f34-96ae-d918e3128ec4\") " pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.579348 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.579325 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfpwj\" (UniqueName: \"kubernetes.io/projected/9ab31a48-0dcf-4d61-94cd-b05c680c9b49-kube-api-access-qfpwj\") pod \"ovnkube-node-d9tnf\" (UID: \"9ab31a48-0dcf-4d61-94cd-b05c680c9b49\") " pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.742395 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.742345 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lnx79" Apr 20 19:20:35.751150 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.751124 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:35.765944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.765922 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" Apr 20 19:20:35.770630 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.770575 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4g46x" Apr 20 19:20:35.780244 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.780226 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-l87ws" Apr 20 19:20:35.787868 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.787848 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bcggv" Apr 20 19:20:35.794542 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.794525 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:35.806115 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:35.806088 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fkcml" Apr 20 19:20:36.022980 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.022889 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-frnjg"] Apr 20 19:20:36.025905 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.025884 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.026024 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.025965 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:36.066183 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.066131 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-kubelet-config\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.066365 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.066207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-dbus\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.066365 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.066268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:36.066365 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.066300 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.066547 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.066380 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:36.066547 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.066456 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:37.06643764 +0000 UTC m=+4.221987452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:36.167172 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.167133 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-dbus\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.167349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.167191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.167349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.167237 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-kubelet-config\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 
19:20:36.167349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.167277 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:36.167349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.167316 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-dbus\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.167397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-kubelet-config\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.167398 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.167471 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:20:36.667452338 +0000 UTC m=+3.823002137 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.167405 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.167500 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.167523 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:36.167595 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.167573 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:37.167557535 +0000 UTC m=+4.323107332 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:36.491941 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:36.491913 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab31a48_0dcf_4d61_94cd_b05c680c9b49.slice/crio-b1b08a6a1403a8176988e84d0c6d402b4883405605b8f994f30e1f7c0419fddf WatchSource:0}: Error finding container b1b08a6a1403a8176988e84d0c6d402b4883405605b8f994f30e1f7c0419fddf: Status 404 returned error can't find the container with id b1b08a6a1403a8176988e84d0c6d402b4883405605b8f994f30e1f7c0419fddf Apr 20 19:20:36.493832 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:36.493806 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ca0838_f833_4608_bf4f_d6f498c3c609.slice/crio-6149cffa39a78455598b4060305e2e55bd0ee5ebb469d363186bc2f40c5e6800 WatchSource:0}: Error finding container 6149cffa39a78455598b4060305e2e55bd0ee5ebb469d363186bc2f40c5e6800: Status 404 returned error can't find the container with id 6149cffa39a78455598b4060305e2e55bd0ee5ebb469d363186bc2f40c5e6800 Apr 20 19:20:36.494513 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:36.494477 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9411047e_7c19_48c0_8520_4e2ff4ee80ec.slice/crio-2a31856ba544b25420a66c46e03bdd203c4e2cab65aa8a5c8e5e8dd71e039227 WatchSource:0}: Error finding container 2a31856ba544b25420a66c46e03bdd203c4e2cab65aa8a5c8e5e8dd71e039227: Status 404 returned error can't find the 
container with id 2a31856ba544b25420a66c46e03bdd203c4e2cab65aa8a5c8e5e8dd71e039227 Apr 20 19:20:36.496306 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:36.496285 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb98fcc4_ee47_45ed_bae3_05703748d0df.slice/crio-de585b653dd2075e6337b9a914ac35c5fc44bcb430771584fb3110776b9819d1 WatchSource:0}: Error finding container de585b653dd2075e6337b9a914ac35c5fc44bcb430771584fb3110776b9819d1: Status 404 returned error can't find the container with id de585b653dd2075e6337b9a914ac35c5fc44bcb430771584fb3110776b9819d1 Apr 20 19:20:36.499335 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:36.499145 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf817bf27_7a92_45a2_acd6_3f63fdbb765d.slice/crio-545b37178dcf3db34e32b8d16c73bf87f014025739454908ecc32f966c76cb54 WatchSource:0}: Error finding container 545b37178dcf3db34e32b8d16c73bf87f014025739454908ecc32f966c76cb54: Status 404 returned error can't find the container with id 545b37178dcf3db34e32b8d16c73bf87f014025739454908ecc32f966c76cb54 Apr 20 19:20:36.520381 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.520290 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:15:34 +0000 UTC" deadline="2028-01-14 18:05:10.980594444 +0000 UTC" Apr 20 19:20:36.520381 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.520324 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15214h44m34.460273971s" Apr 20 19:20:36.556828 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.556794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" 
event={"ID":"e4b6d1e4-212d-4b20-b151-bd58fb0830d1","Type":"ContainerStarted","Data":"b5f46e979224798a95210e9a7073bf32cded6bcba1396910955273aa1a8f361d"} Apr 20 19:20:36.557696 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.557671 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lnx79" event={"ID":"297f9c02-f4d5-419e-916b-2590d0104a7b","Type":"ContainerStarted","Data":"37a515001febcded10dd30b1bc0d983504104addf123750de6fd3292c9de9f70"} Apr 20 19:20:36.558649 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.558622 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4g46x" event={"ID":"9411047e-7c19-48c0-8520-4e2ff4ee80ec","Type":"ContainerStarted","Data":"2a31856ba544b25420a66c46e03bdd203c4e2cab65aa8a5c8e5e8dd71e039227"} Apr 20 19:20:36.559603 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.559579 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fkcml" event={"ID":"43ca0838-f833-4608-bf4f-d6f498c3c609","Type":"ContainerStarted","Data":"6149cffa39a78455598b4060305e2e55bd0ee5ebb469d363186bc2f40c5e6800"} Apr 20 19:20:36.560566 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.560545 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerStarted","Data":"092c6eba27d0b29a0acb83fe32ef9792f2a23ff3cf612ad9fb48b46247524888"} Apr 20 19:20:36.561485 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.561464 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z6jmk" event={"ID":"f817bf27-7a92-45a2-acd6-3f63fdbb765d","Type":"ContainerStarted","Data":"545b37178dcf3db34e32b8d16c73bf87f014025739454908ecc32f966c76cb54"} Apr 20 19:20:36.562614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.562595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-bcggv" event={"ID":"cb98fcc4-ee47-45ed-bae3-05703748d0df","Type":"ContainerStarted","Data":"de585b653dd2075e6337b9a914ac35c5fc44bcb430771584fb3110776b9819d1"} Apr 20 19:20:36.563547 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.563526 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"b1b08a6a1403a8176988e84d0c6d402b4883405605b8f994f30e1f7c0419fddf"} Apr 20 19:20:36.670403 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.670362 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:36.670552 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.670514 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:36.670592 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:36.670582 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:20:37.670567502 +0000 UTC m=+4.826117304 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:36.793548 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.792796 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-78xk2"] Apr 20 19:20:36.795524 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.795498 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.798269 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.797872 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:20:36.798269 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.797947 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:20:36.798269 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.798103 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sqxxc\"" Apr 20 19:20:36.872427 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.872379 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-hosts-file\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.872612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.872449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-tmp-dir\") pod 
\"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.872612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.872477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sqf\" (UniqueName: \"kubernetes.io/projected/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-kube-api-access-x6sqf\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.973606 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.973568 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-hosts-file\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.973753 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.973631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-tmp-dir\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.973753 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.973659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sqf\" (UniqueName: \"kubernetes.io/projected/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-kube-api-access-x6sqf\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.974048 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.974022 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-hosts-file\") pod 
\"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:36.974376 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:36.974346 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-tmp-dir\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:37.001118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.001089 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sqf\" (UniqueName: \"kubernetes.io/projected/ab0118c6-b394-43bc-bf6d-fb41c2beb9d1-kube-api-access-x6sqf\") pod \"node-resolver-78xk2\" (UID: \"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1\") " pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:37.074855 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.074771 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:37.080470 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.080438 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:37.080626 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.080545 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.080521472 +0000 UTC m=+6.236071280 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:37.105413 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.105376 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-78xk2" Apr 20 19:20:37.128158 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:20:37.128120 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0118c6_b394_43bc_bf6d_fb41c2beb9d1.slice/crio-b895ac0ddbddf384f2383975a607f743e4b14e578fd0f7ffdcd3698abb4a67ca WatchSource:0}: Error finding container b895ac0ddbddf384f2383975a607f743e4b14e578fd0f7ffdcd3698abb4a67ca: Status 404 returned error can't find the container with id b895ac0ddbddf384f2383975a607f743e4b14e578fd0f7ffdcd3698abb4a67ca Apr 20 19:20:37.176226 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.176191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:37.176434 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.176416 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:37.176497 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.176443 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 20 19:20:37.176497 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.176456 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:37.176651 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.176522 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.176494299 +0000 UTC m=+6.332044110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:37.550334 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.550289 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:37.550767 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.550430 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:37.550827 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.550813 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:37.550961 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.550876 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:37.550961 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.550932 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:37.551096 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.550977 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:37.577774 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.577680 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal" event={"ID":"eb80aa0e1c4282d53926185e2c1c5fe9","Type":"ContainerStarted","Data":"33eb4bcbddcec2c1e69a8e64148e8a2163880287eae5796242130b1562e4af9c"} Apr 20 19:20:37.585445 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.585408 2580 generic.go:358] "Generic (PLEG): container finished" podID="3fca86c6de3ee30287aabb22e368c4de" containerID="a77ecb69b3c10be281b2a1c683dcc27384e5a2a37c992fdf2aa99c87a5655f66" exitCode=0 Apr 20 19:20:37.585605 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.585510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" event={"ID":"3fca86c6de3ee30287aabb22e368c4de","Type":"ContainerDied","Data":"a77ecb69b3c10be281b2a1c683dcc27384e5a2a37c992fdf2aa99c87a5655f66"} Apr 20 19:20:37.593354 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.593324 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-78xk2" event={"ID":"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1","Type":"ContainerStarted","Data":"b895ac0ddbddf384f2383975a607f743e4b14e578fd0f7ffdcd3698abb4a67ca"} Apr 20 19:20:37.603858 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.603756 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-118.ec2.internal" podStartSLOduration=3.603737381 podStartE2EDuration="3.603737381s" podCreationTimestamp="2026-04-20 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:20:37.590835166 +0000 UTC m=+4.746384986" watchObservedRunningTime="2026-04-20 
19:20:37.603737381 +0000 UTC m=+4.759287201" Apr 20 19:20:37.679977 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:37.679943 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:37.680108 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.680092 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:37.680184 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:37.680165 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.680143752 +0000 UTC m=+6.835693552 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:38.623062 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:38.623026 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" event={"ID":"3fca86c6de3ee30287aabb22e368c4de","Type":"ContainerStarted","Data":"d7dafd163f6463e69364ad1412de29f57ea706d934b9fbe90ea92b22c3d6728d"} Apr 20 19:20:39.094945 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:39.094295 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:39.094945 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.094496 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:39.094945 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.094561 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:43.094541698 +0000 UTC m=+10.250091513 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:39.195968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:39.195727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:39.195968 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.195883 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:39.195968 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.195904 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:20:39.195968 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.195917 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:39.195968 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.195974 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:20:43.195956075 +0000 UTC m=+10.351505878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:39.548152 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:39.548009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:39.548319 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.548157 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:39.548668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:39.548434 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:39.548668 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.548549 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:39.548668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:39.548009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:39.548668 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.548620 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:39.700024 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:39.699985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:39.700505 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.700150 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:39.700505 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:39.700212 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:20:43.700192344 +0000 UTC m=+10.855742155 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:41.550403 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:41.550374 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:41.550891 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:41.550482 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:41.550891 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:41.550857 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:41.551005 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:41.550943 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:41.551060 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:41.551009 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:41.551105 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:41.551076 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:43.128471 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:43.128429 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:43.128906 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.128599 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:43.128906 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.128666 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:20:51.12864724 +0000 UTC m=+18.284197052 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:43.229846 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:43.229237 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:43.229846 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.229405 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:43.229846 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.229425 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:20:43.229846 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.229437 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:43.229846 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.229493 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:20:51.229475139 +0000 UTC m=+18.385024951 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:43.550460 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:43.550364 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:43.550622 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.550505 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:43.552093 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:43.550898 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:43.552093 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.550994 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:43.552093 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:43.551049 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:43.552093 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.551115 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:43.734042 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:43.733956 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:43.734226 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.734134 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:43.734226 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:43.734212 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:20:51.734192267 +0000 UTC m=+18.889742070 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:45.547777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:45.547731 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:45.547777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:45.547778 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:45.548378 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:45.547857 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:45.548378 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:45.547739 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:45.548506 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:45.548470 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:45.548636 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:45.548589 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:47.547952 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:47.547917 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:47.548460 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:47.547917 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:47.548460 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:47.548058 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:47.548460 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:47.547917 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:47.548460 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:47.548129 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:47.548460 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:47.548307 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:49.548176 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:49.548097 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:49.548640 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:49.548227 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:49.548640 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:49.548099 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:49.548640 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:49.548340 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:49.548640 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:49.548101 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:49.548640 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:49.548415 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:51.190339 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:51.190299 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:51.190711 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.190469 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:51.190711 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.190539 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:07.190523178 +0000 UTC m=+34.346072975 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:51.291423 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:51.291390 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:51.291650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.291513 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:20:51.291650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.291528 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:20:51.291650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.291537 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:51.291650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.291595 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:21:07.291576959 +0000 UTC m=+34.447126770 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:20:51.548582 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:51.548501 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:51.548737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:51.548501 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:51.548737 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.548641 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:51.548737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:51.548501 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:51.548862 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.548745 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:51.548862 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.548776 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:51.794449 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:51.794412 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:51.794639 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.794537 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:51.794639 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:51.794594 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:21:07.794580241 +0000 UTC m=+34.950130041 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:20:53.548574 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:53.548543 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:53.549131 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:53.548639 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:53.549131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:53.548726 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:53.549131 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:53.548837 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:53.549131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:53.548888 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:53.549131 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:53.548978 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:54.648308 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.647840 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" event={"ID":"e4b6d1e4-212d-4b20-b151-bd58fb0830d1","Type":"ContainerStarted","Data":"b2eed07a96a66d2fb7c778b6226be642175be5fedcfdf668338acd9c3dcbae79"} Apr 20 19:20:54.649852 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.649815 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4g46x" event={"ID":"9411047e-7c19-48c0-8520-4e2ff4ee80ec","Type":"ContainerStarted","Data":"327b06f4a21f650f63db666b70a1bf5e6877725b8427dddf4b4c23ab173bb63c"} Apr 20 19:20:54.651737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.651436 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fkcml" event={"ID":"43ca0838-f833-4608-bf4f-d6f498c3c609","Type":"ContainerStarted","Data":"9aca28732b33e5c951b78c9d0447d9c7f658dfaa20f60f35129e974652cb9532"} Apr 20 19:20:54.653128 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.653078 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-78xk2" event={"ID":"ab0118c6-b394-43bc-bf6d-fb41c2beb9d1","Type":"ContainerStarted","Data":"e63851ce51b3fc87a806a985235aa539e234a779034ef4429d324c187b503f12"} Apr 20 19:20:54.654676 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:20:54.654640 2580 generic.go:358] "Generic (PLEG): container finished" podID="58118261-be4f-4f34-96ae-d918e3128ec4" containerID="09645c7a627f22423128f0217940518cc9d68f9e44babb051bb780970d805e50" exitCode=0 Apr 20 19:20:54.654767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.654735 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerDied","Data":"09645c7a627f22423128f0217940518cc9d68f9e44babb051bb780970d805e50"} Apr 20 19:20:54.659550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.658775 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z6jmk" event={"ID":"f817bf27-7a92-45a2-acd6-3f63fdbb765d","Type":"ContainerStarted","Data":"e326e34292df07ff2ac71488648dfea9e666def7d49a31229afdf8b6a850a9db"} Apr 20 19:20:54.661115 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.661079 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bcggv" event={"ID":"cb98fcc4-ee47-45ed-bae3-05703748d0df","Type":"ContainerStarted","Data":"03fd78a393cca280791c1ce4d6805a1bc42b60882a800d5c20ded948da3b05d9"} Apr 20 19:20:54.664056 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664041 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:20:54.664561 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664538 2580 generic.go:358] "Generic (PLEG): container finished" podID="9ab31a48-0dcf-4d61-94cd-b05c680c9b49" containerID="50de5f0571bc81055b83aad8a63b77f325371feba8a5e061de959f5a0ed77113" exitCode=1 Apr 20 19:20:54.664661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664562 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" 
event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"006dd6b15eeafd244c3499990205a3f7b102db1fc376c5cabd9cbc41f543778a"} Apr 20 19:20:54.664661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664588 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"345d7decf951de86ad18f1eeb27b788e43d5c30e7762809547db77d9c9e2c778"} Apr 20 19:20:54.664661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664601 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"9b28e3f522c5fd26879a2f200e5068615e18bb7956a5d61972f734f473a499c4"} Apr 20 19:20:54.664661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664615 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"037b65760d18b19dc26aab34a6918468188bc5bef7f8bc3757249c0d226438a5"} Apr 20 19:20:54.664661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664626 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerDied","Data":"50de5f0571bc81055b83aad8a63b77f325371feba8a5e061de959f5a0ed77113"} Apr 20 19:20:54.664661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.664641 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"fe43cbe257c625d4adfcf5d6e9e2df2172aa2ad3a69b210ec431aaa765266fe9"} Apr 20 19:20:54.674569 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.674517 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-4g46x" podStartSLOduration=4.46217679 podStartE2EDuration="21.674499519s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.496504256 +0000 UTC m=+3.652054058" lastFinishedPulling="2026-04-20 19:20:53.708826985 +0000 UTC m=+20.864376787" observedRunningTime="2026-04-20 19:20:54.673705469 +0000 UTC m=+21.829255290" watchObservedRunningTime="2026-04-20 19:20:54.674499519 +0000 UTC m=+21.830049340" Apr 20 19:20:54.675513 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.675475 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-118.ec2.internal" podStartSLOduration=20.675461066 podStartE2EDuration="20.675461066s" podCreationTimestamp="2026-04-20 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:20:38.637824479 +0000 UTC m=+5.793374299" watchObservedRunningTime="2026-04-20 19:20:54.675461066 +0000 UTC m=+21.831010887" Apr 20 19:20:54.720854 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.720795 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bcggv" podStartSLOduration=4.49396215 podStartE2EDuration="21.720776209s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.499028358 +0000 UTC m=+3.654578160" lastFinishedPulling="2026-04-20 19:20:53.725842421 +0000 UTC m=+20.881392219" observedRunningTime="2026-04-20 19:20:54.715521442 +0000 UTC m=+21.871071273" watchObservedRunningTime="2026-04-20 19:20:54.720776209 +0000 UTC m=+21.876326029" Apr 20 19:20:54.756228 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.756181 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z6jmk" podStartSLOduration=4.551599288 
podStartE2EDuration="21.756166652s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.50185333 +0000 UTC m=+3.657403140" lastFinishedPulling="2026-04-20 19:20:53.706420691 +0000 UTC m=+20.861970504" observedRunningTime="2026-04-20 19:20:54.744049421 +0000 UTC m=+21.899599234" watchObservedRunningTime="2026-04-20 19:20:54.756166652 +0000 UTC m=+21.911716471" Apr 20 19:20:54.771575 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.771526 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-78xk2" podStartSLOduration=2.197920091 podStartE2EDuration="18.771514404s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:20:37.139950779 +0000 UTC m=+4.295500592" lastFinishedPulling="2026-04-20 19:20:53.713545108 +0000 UTC m=+20.869094905" observedRunningTime="2026-04-20 19:20:54.771479939 +0000 UTC m=+21.927029758" watchObservedRunningTime="2026-04-20 19:20:54.771514404 +0000 UTC m=+21.927064223" Apr 20 19:20:54.772034 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:54.772003 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fkcml" podStartSLOduration=4.629246238 podStartE2EDuration="21.771997576s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.496114655 +0000 UTC m=+3.651664452" lastFinishedPulling="2026-04-20 19:20:53.638865992 +0000 UTC m=+20.794415790" observedRunningTime="2026-04-20 19:20:54.756299301 +0000 UTC m=+21.911849098" watchObservedRunningTime="2026-04-20 19:20:54.771997576 +0000 UTC m=+21.927547401" Apr 20 19:20:55.479432 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.479377 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 19:20:55.522126 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.521986 
2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:20:55.479403454Z","UUID":"60d0a1f8-2242-452e-b581-c925c2cf0c2c","Handler":null,"Name":"","Endpoint":""} Apr 20 19:20:55.525206 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.525181 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 19:20:55.525361 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.525216 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 19:20:55.548790 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.548749 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:55.548956 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:55.548885 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:55.548956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.548927 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:55.549032 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:55.549001 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:55.549065 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.549042 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:55.549156 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:55.549131 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:55.669299 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.669262 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" event={"ID":"e4b6d1e4-212d-4b20-b151-bd58fb0830d1","Type":"ContainerStarted","Data":"8783c0ed247c2b6aa68c020acf957ca8b8843618306ea527d1f14a22b8885550"} Apr 20 19:20:55.670768 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.670733 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lnx79" event={"ID":"297f9c02-f4d5-419e-916b-2590d0104a7b","Type":"ContainerStarted","Data":"a5873dbf5d18f0ec3d66c01ad6e671dfc303bcc9e063436a2d76f8adfcbabb22"} Apr 20 19:20:55.695114 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:55.695051 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lnx79" podStartSLOduration=5.5105109169999995 podStartE2EDuration="22.695034375s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.503564436 +0000 UTC m=+3.659114248" lastFinishedPulling="2026-04-20 19:20:53.688087895 +0000 UTC m=+20.843637706" observedRunningTime="2026-04-20 19:20:55.694834908 +0000 UTC m=+22.850384755" watchObservedRunningTime="2026-04-20 19:20:55.695034375 +0000 UTC m=+22.850584194" Apr 20 19:20:57.548012 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.547972 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:57.548631 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.548016 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:57.548631 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.548042 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:57.548631 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:57.548110 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:57.548631 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:57.548274 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:57.548631 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:57.548375 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:57.677760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.677731 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:20:57.678177 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.678147 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"e51a3cfebd5e96483a31bc2747db248988ad3f6a2c2d7ec99169ff3b0c0189c5"} Apr 20 19:20:57.680666 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.680637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" event={"ID":"e4b6d1e4-212d-4b20-b151-bd58fb0830d1","Type":"ContainerStarted","Data":"22b800bc8d9ccbccadb98052b31281cd3039d8a894efeb38ca0c63ff73549956"} Apr 20 19:20:57.942394 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.942321 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:57.943203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.943182 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:57.973201 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:57.973146 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fpxmn" podStartSLOduration=4.6730878780000005 podStartE2EDuration="24.973129017s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.504820751 +0000 UTC m=+3.660370556" lastFinishedPulling="2026-04-20 19:20:56.804861884 +0000 UTC m=+23.960411695" observedRunningTime="2026-04-20 19:20:57.705105104 
+0000 UTC m=+24.860654925" watchObservedRunningTime="2026-04-20 19:20:57.973129017 +0000 UTC m=+25.128678840" Apr 20 19:20:58.682845 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:58.682803 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:58.683309 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:58.683287 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z6jmk" Apr 20 19:20:59.548491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.548263 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:20:59.548624 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.548265 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:20:59.548624 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:59.548546 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:20:59.548624 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.548277 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:20:59.548624 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:59.548614 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1" Apr 20 19:20:59.548806 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:20:59.548735 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c" Apr 20 19:20:59.686064 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.686024 2580 generic.go:358] "Generic (PLEG): container finished" podID="58118261-be4f-4f34-96ae-d918e3128ec4" containerID="ef64915a9b4e5a9eecd348dc3ba0eb43f6e4f0aaafadcd46a7cfea589ad52067" exitCode=0 Apr 20 19:20:59.686521 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.686106 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerDied","Data":"ef64915a9b4e5a9eecd348dc3ba0eb43f6e4f0aaafadcd46a7cfea589ad52067"} Apr 20 19:20:59.689285 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.689265 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:20:59.689651 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.689612 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"2af033eeecf6465089a623e24514b3ddd0d2735bdd8ee50c0d897d52bd1f4e9b"} Apr 20 19:20:59.690118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.690075 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:59.690118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.690102 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:20:59.690297 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.690152 2580 scope.go:117] "RemoveContainer" containerID="50de5f0571bc81055b83aad8a63b77f325371feba8a5e061de959f5a0ed77113" Apr 20 19:20:59.705599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:20:59.705580 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:21:00.694723 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:00.694563 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:21:00.695084 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:00.695063 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" event={"ID":"9ab31a48-0dcf-4d61-94cd-b05c680c9b49","Type":"ContainerStarted","Data":"dfcf68082d814e26dc6c665fd7b080e64c2ce6fc2007aa8a320cfacbabf085de"} Apr 20 19:21:00.695269 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:00.695230 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:21:00.710772 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:00.710193 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:21:00.722723 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:00.722684 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" podStartSLOduration=10.24732151 podStartE2EDuration="27.722672962s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" 
firstStartedPulling="2026-04-20 19:20:36.493683245 +0000 UTC m=+3.649233045" lastFinishedPulling="2026-04-20 19:20:53.969034681 +0000 UTC m=+21.124584497" observedRunningTime="2026-04-20 19:21:00.722275765 +0000 UTC m=+27.877825583" watchObservedRunningTime="2026-04-20 19:21:00.722672962 +0000 UTC m=+27.878222781"
Apr 20 19:21:01.085706 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.085628 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-frnjg"]
Apr 20 19:21:01.085855 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.085782 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg"
Apr 20 19:21:01.085924 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:01.085896 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c"
Apr 20 19:21:01.089574 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.089544 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mw5qh"]
Apr 20 19:21:01.089721 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.089657 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:21:01.089786 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:01.089749 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987"
Apr 20 19:21:01.090309 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.090280 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zldgh"]
Apr 20 19:21:01.090426 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.090396 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:21:01.090490 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:01.090472 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1"
Apr 20 19:21:01.698614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.698578 2580 generic.go:358] "Generic (PLEG): container finished" podID="58118261-be4f-4f34-96ae-d918e3128ec4" containerID="3416898f07f94bc6bf44759e51300576059146fe6f37fecc31533df3a2f5e810" exitCode=0
Apr 20 19:21:01.699052 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:01.698668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerDied","Data":"3416898f07f94bc6bf44759e51300576059146fe6f37fecc31533df3a2f5e810"}
Apr 20 19:21:02.548297 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:02.548264 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:21:02.548297 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:02.548304 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg"
Apr 20 19:21:02.548482 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:02.548381 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1"
Apr 20 19:21:02.548549 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:02.548506 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c"
Apr 20 19:21:03.549176 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:03.549142 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:21:03.549680 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:03.549235 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987"
Apr 20 19:21:03.704684 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:03.704655 2580 generic.go:358] "Generic (PLEG): container finished" podID="58118261-be4f-4f34-96ae-d918e3128ec4" containerID="4705dbea539d87bbc3e5901369cc4fc6edeba3d8d70ad016694ce13484caf5fc" exitCode=0
Apr 20 19:21:03.704809 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:03.704721 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerDied","Data":"4705dbea539d87bbc3e5901369cc4fc6edeba3d8d70ad016694ce13484caf5fc"}
Apr 20 19:21:04.548785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:04.548556 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:21:04.548949 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:04.548564 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg"
Apr 20 19:21:04.548949 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:04.548875 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1"
Apr 20 19:21:04.549051 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:04.548947 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c"
Apr 20 19:21:05.547791 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:05.547759 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:21:05.548201 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:05.547901 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987"
Apr 20 19:21:06.547863 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:06.547818 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg"
Apr 20 19:21:06.548326 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:06.547853 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:21:06.548326 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:06.547965 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-frnjg" podUID="1e64ae2b-e7d8-473c-9c0c-c27208349a5c"
Apr 20 19:21:06.548326 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:06.548085 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zldgh" podUID="737ef4d2-9017-4b74-b25c-6478eda78bb1"
Apr 20 19:21:07.209714 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.209671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:21:07.210001 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.209870 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:21:07.210001 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.209959 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:39.209937964 +0000 UTC m=+66.365487772 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:21:07.310028 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.309987 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh"
Apr 20 19:21:07.310224 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.310202 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:21:07.310295 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.310229 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:21:07.310295 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.310240 2580 projected.go:194] Error preparing data for projected volume kube-api-access-shr5h for pod openshift-network-diagnostics/network-check-target-zldgh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:21:07.310358 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.310313 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h podName:737ef4d2-9017-4b74-b25c-6478eda78bb1 nodeName:}" failed.
No retries permitted until 2026-04-20 19:21:39.31029762 +0000 UTC m=+66.465847422 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-shr5h" (UniqueName: "kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h") pod "network-check-target-zldgh" (UID: "737ef4d2-9017-4b74-b25c-6478eda78bb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:21:07.548819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.548789 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:21:07.549293 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.548905 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987"
Apr 20 19:21:07.633677 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.633647 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-118.ec2.internal" event="NodeReady"
Apr 20 19:21:07.633849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.633800 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 19:21:07.669027 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.668997 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5755568fd6-9nlq5"]
Apr 20 19:21:07.672024 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.671270 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.672024 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.671539 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-76ngm"]
Apr 20 19:21:07.673887 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.673627 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 19:21:07.673887 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.673651 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 19:21:07.673887 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.673627 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mf28l\""
Apr 20 19:21:07.673887 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.673804 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm"
Apr 20 19:21:07.674134 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.673958 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 19:21:07.675734 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.675713 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wvwpr\""
Apr 20 19:21:07.676125 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.675953 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 19:21:07.679080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.678908 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 19:21:07.679890 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.679870 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 19:21:07.681929 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.681903 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8q97z"]
Apr 20 19:21:07.683798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.683780 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8q97z"
Apr 20 19:21:07.685179 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.685159 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-h2wph"]
Apr 20 19:21:07.686490 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.686471 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 19:21:07.686865 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.686792 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5755568fd6-9nlq5"]
Apr 20 19:21:07.686865 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.686816 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-76ngm"]
Apr 20 19:21:07.686996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.686924 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.687327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.687309 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 19:21:07.687422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.687324 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ltv6t\""
Apr 20 19:21:07.687422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.687348 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 19:21:07.690896 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.690880 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 19:21:07.691083 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.690906 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qtv7w\""
Apr 20 19:21:07.691196 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.690945 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 19:21:07.694055 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.694035 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8q97z"]
Apr 20 19:21:07.700662 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.700638 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h2wph"]
Apr 20 19:21:07.813441 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName:
\"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-image-registry-private-configuration\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.813441 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.813441 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813430 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-installation-pull-secrets\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.813678 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-bound-sa-token\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.813678 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/593cc9e9-b499-4686-897a-e1b604685e20-tmp-dir\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") "
pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.813678 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813566 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlxdl\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-kube-api-access-zlxdl\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.813678 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813592 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvlj\" (UniqueName: \"kubernetes.io/projected/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-kube-api-access-9vvlj\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z"
Apr 20 19:21:07.813678 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813619 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/593cc9e9-b499-4686-897a-e1b604685e20-config-volume\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.813902 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813700 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3fa1e505-222b-4d26-b6c6-b500bff9d597-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm"
Apr 20 19:21:07.813902 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813742 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm"
Apr 20 19:21:07.813902 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813768 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-ca-trust-extracted\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.813902 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813805 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4qsp\" (UniqueName: \"kubernetes.io/projected/593cc9e9-b499-4686-897a-e1b604685e20-kube-api-access-l4qsp\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.813902 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813835 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z"
Apr 20 19:21:07.813902 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-trusted-ca\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") "
pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.814106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813925 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg"
Apr 20 19:21:07.814106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813955 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-certificates\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.814106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.813982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.814106 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.814065 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:21:07.814269 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.814130 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret podName:1e64ae2b-e7d8-473c-9c0c-c27208349a5c nodeName:}" failed. No retries permitted until 2026-04-20 19:21:39.814112193 +0000 UTC m=+66.969661993 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret") pod "global-pull-secret-syncer-frnjg" (UID: "1e64ae2b-e7d8-473c-9c0c-c27208349a5c") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:21:07.914683 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.914639 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-certificates\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.914875 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.914710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.914875 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.914841 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:07.914982 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.914909 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:08.414884802 +0000 UTC m=+35.570434612 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:07.915891 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.915863 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-certificates\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.916082 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-image-registry-private-configuration\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.916167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.916167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-installation-pull-secrets\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") "
pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.916286 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-bound-sa-token\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.916286 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916219 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/593cc9e9-b499-4686-897a-e1b604685e20-tmp-dir\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph"
Apr 20 19:21:07.916286 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916276 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlxdl\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-kube-api-access-zlxdl\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:21:07.916478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916314 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvlj\" (UniqueName: \"kubernetes.io/projected/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-kube-api-access-9vvlj\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z"
Apr 20 19:21:07.916478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916349 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName:
\"kubernetes.io/configmap/593cc9e9-b499-4686-897a-e1b604685e20-config-volume\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:07.916478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916400 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3fa1e505-222b-4d26-b6c6-b500bff9d597-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:07.916478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916436 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:07.916478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916471 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-ca-trust-extracted\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.916693 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916516 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4qsp\" (UniqueName: \"kubernetes.io/projected/593cc9e9-b499-4686-897a-e1b604685e20-kube-api-access-l4qsp\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:07.916693 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:21:07.916558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:07.916693 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.916616 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-trusted-ca\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.918057 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.918028 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-trusted-ca\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.918599 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.918571 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:07.918691 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.918602 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:07.918691 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.918662 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. 
No retries permitted until 2026-04-20 19:21:08.418645014 +0000 UTC m=+35.574194824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:07.918920 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.918895 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:21:07.919001 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.918979 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:08.418963164 +0000 UTC m=+35.574512964 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:07.919110 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.919074 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-ca-trust-extracted\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.919235 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.919206 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:07.919317 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.919282 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/593cc9e9-b499-4686-897a-e1b604685e20-tmp-dir\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:07.919317 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:07.919309 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:08.419287771 +0000 UTC m=+35.574837567 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:07.919412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.919350 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/593cc9e9-b499-4686-897a-e1b604685e20-config-volume\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:07.919412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.919376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3fa1e505-222b-4d26-b6c6-b500bff9d597-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:07.923193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.923168 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-installation-pull-secrets\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.923319 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.923190 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-image-registry-private-configuration\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.929879 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:21:07.929850 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvlj\" (UniqueName: \"kubernetes.io/projected/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-kube-api-access-9vvlj\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:07.930004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.929907 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4qsp\" (UniqueName: \"kubernetes.io/projected/593cc9e9-b499-4686-897a-e1b604685e20-kube-api-access-l4qsp\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:07.930085 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.930066 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-bound-sa-token\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:07.930228 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:07.930207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlxdl\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-kube-api-access-zlxdl\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:08.420352 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.420314 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " 
pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:08.420565 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.420382 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:08.420565 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.420414 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:08.420565 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.420459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:08.420565 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420480 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:08.420565 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420519 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:08.420565 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420531 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" 
not found Apr 20 19:21:08.420834 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420575 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:08.420834 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420597 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.42057721 +0000 UTC m=+36.576127046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:08.420834 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420616 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.420606761 +0000 UTC m=+36.576156563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:08.420834 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420632 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:21:09.420623851 +0000 UTC m=+36.576173671 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:21:08.420834 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420643 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:08.420834 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:08.420684 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:09.420675225 +0000 UTC m=+36.576225023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:08.548230 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.548189 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:21:08.548435 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.548189 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:21:08.550859 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.550835 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:21:08.550859 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.550851 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:21:08.551683 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.551663 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-67k9h\"" Apr 20 19:21:08.551796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:08.551685 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:21:09.429724 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.429679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.429752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.429790 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.429841 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429847 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429871 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429878 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429924 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429930 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:11.429911889 +0000 UTC m=+38.585461717 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429927 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:09.429942 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429950 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:21:11.42994194 +0000 UTC m=+38.585491737 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:09.430413 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.429985 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:11.42996846 +0000 UTC m=+38.585518280 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:09.430413 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:09.430002 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:11.42999427 +0000 UTC m=+38.585544067 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:21:09.548612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.548581 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:21:09.550944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.550921 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:21:09.551337 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.551030 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gxr6p\"" Apr 20 19:21:09.719099 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:09.719011 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerStarted","Data":"0b255d1987327f6d06f250c997bc59cf65e78c94ba023f957db9a3266cf84500"} Apr 20 19:21:10.722464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:10.722435 2580 generic.go:358] "Generic (PLEG): container finished" podID="58118261-be4f-4f34-96ae-d918e3128ec4" containerID="0b255d1987327f6d06f250c997bc59cf65e78c94ba023f957db9a3266cf84500" exitCode=0 Apr 20 19:21:10.722779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:10.722489 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerDied","Data":"0b255d1987327f6d06f250c997bc59cf65e78c94ba023f957db9a3266cf84500"} Apr 20 19:21:11.446193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:11.446152 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:11.446193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:11.446213 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:11.446430 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446322 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:11.446430 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446335 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:11.446430 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446335 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:11.446430 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:11.446358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:11.446430 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446388 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:21:15.446371937 +0000 UTC m=+42.601921755 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:11.446430 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446414 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:15.446401004 +0000 UTC m=+42.601950801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:21:11.446617 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446432 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:21:11.446617 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:11.446444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:11.446617 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446536 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:11.446617 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446564 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert 
podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:15.446550432 +0000 UTC m=+42.602100252 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:11.446617 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:11.446585 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:15.4465722 +0000 UTC m=+42.602122000 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:11.727410 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:11.727320 2580 generic.go:358] "Generic (PLEG): container finished" podID="58118261-be4f-4f34-96ae-d918e3128ec4" containerID="3e0c64b9ab15457f6a648c3cb4dcac78503b99fef9e301137964f6132c96aca3" exitCode=0 Apr 20 19:21:11.727410 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:11.727383 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerDied","Data":"3e0c64b9ab15457f6a648c3cb4dcac78503b99fef9e301137964f6132c96aca3"} Apr 20 19:21:12.732674 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:12.732633 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-l87ws" 
event={"ID":"58118261-be4f-4f34-96ae-d918e3128ec4","Type":"ContainerStarted","Data":"0447d7f235048261ca00e969bb988851a16ca7bfab99f525be4c230aafbc76ce"} Apr 20 19:21:12.756368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:12.756299 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-l87ws" podStartSLOduration=6.742476334 podStartE2EDuration="39.756283758s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:20:36.500588418 +0000 UTC m=+3.656138228" lastFinishedPulling="2026-04-20 19:21:09.514395847 +0000 UTC m=+36.669945652" observedRunningTime="2026-04-20 19:21:12.754923199 +0000 UTC m=+39.910473023" watchObservedRunningTime="2026-04-20 19:21:12.756283758 +0000 UTC m=+39.911833577" Apr 20 19:21:15.480737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:15.480686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:15.480758 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:15.480798 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 
19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480824 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:15.480844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480883 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:23.480867989 +0000 UTC m=+50.636417812 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480907 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480929 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480932 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480952 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480961 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:23.480946188 +0000 UTC m=+50.636496007 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.480977 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:23.480970754 +0000 UTC m=+50.636520550 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:15.481183 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:15.481016 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:21:23.480999436 +0000 UTC m=+50.636549234 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:23.544705 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:23.544659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:23.544714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:23.544766 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544812 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:23.544819 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544876 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544912 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544924 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544891 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:39.544871704 +0000 UTC m=+66.700421521 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544941 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544956 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:39.544938913 +0000 UTC m=+66.700488710 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544975 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:21:39.544965085 +0000 UTC m=+66.700514885 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:23.545246 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:23.544992 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:39.544980213 +0000 UTC m=+66.700530014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:21:32.711903 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:32.711874 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d9tnf" Apr 20 19:21:39.258510 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.258472 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:21:39.261103 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.261082 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:21:39.269121 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.269105 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:21:39.269181 ip-10-0-134-118 
kubenswrapper[2580]: E0420 19:21:39.269164 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:43.269150153 +0000 UTC m=+130.424699954 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : secret "metrics-daemon-secret" not found Apr 20 19:21:39.359723 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.359680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:21:39.362328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.362311 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:21:39.372440 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.372420 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:21:39.384560 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.384532 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shr5h\" (UniqueName: \"kubernetes.io/projected/737ef4d2-9017-4b74-b25c-6478eda78bb1-kube-api-access-shr5h\") pod \"network-check-target-zldgh\" (UID: \"737ef4d2-9017-4b74-b25c-6478eda78bb1\") " pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:21:39.463243 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:21:39.463210 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-67k9h\"" Apr 20 19:21:39.471774 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.471754 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.565593 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.565653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.565687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.565737 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: 
\"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.565859 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.565918 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:11.565900899 +0000 UTC m=+98.721450713 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566380 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566395 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566439 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:22:11.566424792 +0000 UTC m=+98.721974604 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566498 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566545 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:11.566521887 +0000 UTC m=+98.722071698 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566597 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:39.568387 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:21:39.566623 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:11.566614105 +0000 UTC m=+98.722163916 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:21:39.621998 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.621964 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zldgh"] Apr 20 19:21:39.625458 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:21:39.625431 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737ef4d2_9017_4b74_b25c_6478eda78bb1.slice/crio-98100a66f858a8bff21c892177136ee09e9f127783a0071ae86cbf097f8d12df WatchSource:0}: Error finding container 98100a66f858a8bff21c892177136ee09e9f127783a0071ae86cbf097f8d12df: Status 404 returned error can't find the container with id 98100a66f858a8bff21c892177136ee09e9f127783a0071ae86cbf097f8d12df Apr 20 19:21:39.786782 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.786691 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zldgh" event={"ID":"737ef4d2-9017-4b74-b25c-6478eda78bb1","Type":"ContainerStarted","Data":"98100a66f858a8bff21c892177136ee09e9f127783a0071ae86cbf097f8d12df"} Apr 20 19:21:39.868842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.868803 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:21:39.871796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.871777 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 
19:21:39.882031 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:39.882008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e64ae2b-e7d8-473c-9c0c-c27208349a5c-original-pull-secret\") pod \"global-pull-secret-syncer-frnjg\" (UID: \"1e64ae2b-e7d8-473c-9c0c-c27208349a5c\") " pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:21:40.067064 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:40.066977 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-frnjg" Apr 20 19:21:40.195416 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:40.195382 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-frnjg"] Apr 20 19:21:40.198955 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:21:40.198928 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e64ae2b_e7d8_473c_9c0c_c27208349a5c.slice/crio-3e89b683113cc70e3d77552d4eddedeb8732c152a451e21312d068b71dc39144 WatchSource:0}: Error finding container 3e89b683113cc70e3d77552d4eddedeb8732c152a451e21312d068b71dc39144: Status 404 returned error can't find the container with id 3e89b683113cc70e3d77552d4eddedeb8732c152a451e21312d068b71dc39144 Apr 20 19:21:40.789827 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:40.789780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-frnjg" event={"ID":"1e64ae2b-e7d8-473c-9c0c-c27208349a5c","Type":"ContainerStarted","Data":"3e89b683113cc70e3d77552d4eddedeb8732c152a451e21312d068b71dc39144"} Apr 20 19:21:42.794939 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:42.794902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zldgh" 
event={"ID":"737ef4d2-9017-4b74-b25c-6478eda78bb1","Type":"ContainerStarted","Data":"71e220ca8bb5467bd5df70dcb4623e91cfd3fff45a15532e396fa837f7b98dbb"} Apr 20 19:21:42.795338 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:42.795024 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:21:42.812632 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:42.812589 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zldgh" podStartSLOduration=66.991231576 podStartE2EDuration="1m9.812573069s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:21:39.627173614 +0000 UTC m=+66.782723411" lastFinishedPulling="2026-04-20 19:21:42.448515107 +0000 UTC m=+69.604064904" observedRunningTime="2026-04-20 19:21:42.812072516 +0000 UTC m=+69.967622336" watchObservedRunningTime="2026-04-20 19:21:42.812573069 +0000 UTC m=+69.968122887" Apr 20 19:21:45.803058 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:45.803016 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-frnjg" event={"ID":"1e64ae2b-e7d8-473c-9c0c-c27208349a5c","Type":"ContainerStarted","Data":"b0162572dc3209e7531175595ae39f4d9ab87c32e744f254d4141dacd3c34462"} Apr 20 19:21:45.818596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:21:45.818551 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-frnjg" podStartSLOduration=65.071845881 podStartE2EDuration="1m9.818536339s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:21:40.2013375 +0000 UTC m=+67.356887309" lastFinishedPulling="2026-04-20 19:21:44.948027969 +0000 UTC m=+72.103577767" observedRunningTime="2026-04-20 19:21:45.817936628 +0000 UTC m=+72.973486446" watchObservedRunningTime="2026-04-20 19:21:45.818536339 +0000 UTC 
m=+72.974086157" Apr 20 19:22:11.606149 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:11.606114 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:11.606161 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:11.606192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:11.606219 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606266 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606305 2580 secret.go:189] 
Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606329 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert podName:b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:15.606314204 +0000 UTC m=+162.761864001 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert") pod "ingress-canary-8q97z" (UID: "b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2") : secret "canary-serving-cert" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606335 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606353 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert podName:3fa1e505-222b-4d26-b6c6-b500bff9d597 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:15.606341882 +0000 UTC m=+162.761891680 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-76ngm" (UID: "3fa1e505-222b-4d26-b6c6-b500bff9d597") : secret "networking-console-plugin-cert" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606350 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606374 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5755568fd6-9nlq5: secret "image-registry-tls" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606395 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls podName:593cc9e9-b499-4686-897a-e1b604685e20 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:15.60637714 +0000 UTC m=+162.761926947 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls") pod "dns-default-h2wph" (UID: "593cc9e9-b499-4686-897a-e1b604685e20") : secret "dns-default-metrics-tls" not found Apr 20 19:22:11.606644 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:11.606417 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls podName:3d02caf8-5ad7-4d7c-aad3-54babb0bd46b nodeName:}" failed. No retries permitted until 2026-04-20 19:23:15.606403214 +0000 UTC m=+162.761953025 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls") pod "image-registry-5755568fd6-9nlq5" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b") : secret "image-registry-tls" not found Apr 20 19:22:13.800181 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:13.800148 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zldgh" Apr 20 19:22:42.471978 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.471938 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-76f79b8cc6-4kmk9"] Apr 20 19:22:42.475656 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.475635 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.477901 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.477877 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 19:22:42.478039 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.477877 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 19:22:42.478039 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.477904 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 19:22:42.478039 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.478022 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-2g268\"" Apr 20 19:22:42.478207 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.478089 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:22:42.478383 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:22:42.478368 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 19:22:42.478595 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.478580 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:22:42.487394 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.487374 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76f79b8cc6-4kmk9"] Apr 20 19:22:42.529537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.529498 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.529537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.529537 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-default-certificate\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.529785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.529573 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.529785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.529635 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-stats-auth\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.529785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.529745 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5chd\" (UniqueName: \"kubernetes.io/projected/b2caeaea-f388-4a16-a139-404c07f66f1e-kube-api-access-s5chd\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.630466 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.630431 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-stats-auth\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.630645 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.630485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5chd\" (UniqueName: \"kubernetes.io/projected/b2caeaea-f388-4a16-a139-404c07f66f1e-kube-api-access-s5chd\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.630645 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.630527 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: 
\"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.630645 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.630544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-default-certificate\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.630645 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.630571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.630645 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:42.630643 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:22:42.630868 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:42.630690 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:43.130671911 +0000 UTC m=+130.286221714 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : configmap references non-existent config key: service-ca.crt Apr 20 19:22:42.630868 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:42.630715 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:43.130707288 +0000 UTC m=+130.286257088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : secret "router-metrics-certs-default" not found Apr 20 19:22:42.633437 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.633415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-default-certificate\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.633552 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.633534 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-stats-auth\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:42.639207 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:42.639174 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5chd\" (UniqueName: 
\"kubernetes.io/projected/b2caeaea-f388-4a16-a139-404c07f66f1e-kube-api-access-s5chd\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:43.133795 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:43.133750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:43.134000 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:43.133814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:43.134000 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:43.133918 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:22:43.134000 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:43.133946 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:44.133923805 +0000 UTC m=+131.289473606 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : configmap references non-existent config key: service-ca.crt Apr 20 19:22:43.134000 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:43.133998 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:44.133981943 +0000 UTC m=+131.289531740 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : secret "router-metrics-certs-default" not found Apr 20 19:22:43.335961 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:43.335929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:22:43.336148 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:43.336070 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:22:43.336148 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:43.336135 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs podName:a8ada6b3-5038-4d1c-bbe5-a9626c8c1987 nodeName:}" failed. No retries permitted until 2026-04-20 19:24:45.336119912 +0000 UTC m=+252.491669708 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs") pod "network-metrics-daemon-mw5qh" (UID: "a8ada6b3-5038-4d1c-bbe5-a9626c8c1987") : secret "metrics-daemon-secret" not found Apr 20 19:22:44.141204 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:44.141157 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:44.141204 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:44.141214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:44.141650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:44.141334 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:22:44.141650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:44.141361 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:46.141343419 +0000 UTC m=+133.296893216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : configmap references non-existent config key: service-ca.crt Apr 20 19:22:44.141650 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:44.141385 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:46.141377646 +0000 UTC m=+133.296927443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : secret "router-metrics-certs-default" not found Apr 20 19:22:46.158042 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:46.157993 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:46.158464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:46.158133 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:46.158464 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:46.158150 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: 
secret "router-metrics-certs-default" not found Apr 20 19:22:46.158464 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:46.158229 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:50.158211809 +0000 UTC m=+137.313761625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : secret "router-metrics-certs-default" not found Apr 20 19:22:46.158464 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:46.158264 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:50.158239747 +0000 UTC m=+137.313789545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : configmap references non-existent config key: service-ca.crt Apr 20 19:22:48.786388 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:48.786363 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-78xk2_ab0118c6-b394-43bc-bf6d-fb41c2beb9d1/dns-node-resolver/0.log" Apr 20 19:22:49.586479 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:49.586453 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fkcml_43ca0838-f833-4608-bf4f-d6f498c3c609/node-ca/0.log" Apr 20 19:22:50.185689 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:50.185654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:50.186127 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:50.185775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:50.186127 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:50.185808 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:22:50.186127 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:50.185883 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:58.185865338 +0000 UTC m=+145.341415139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : secret "router-metrics-certs-default" not found Apr 20 19:22:50.186127 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:50.185933 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:22:58.185916034 +0000 UTC m=+145.341465847 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : configmap references non-existent config key: service-ca.crt Apr 20 19:22:52.547704 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.547662 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk"] Apr 20 19:22:52.550559 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.550541 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" Apr 20 19:22:52.552813 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.552790 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:22:52.552931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.552913 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 19:22:52.553555 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.553542 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-npdk5\"" Apr 20 19:22:52.559393 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.559373 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk"] Apr 20 19:22:52.604668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.604629 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv5s8\" (UniqueName: \"kubernetes.io/projected/92b63e56-5288-415d-a75a-4c18d45ecf72-kube-api-access-lv5s8\") pod \"volume-data-source-validator-7c6cbb6c87-zb2mk\" (UID: \"92b63e56-5288-415d-a75a-4c18d45ecf72\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" Apr 20 19:22:52.653959 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.653922 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xl6pd"] Apr 20 19:22:52.656752 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.656737 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.659108 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.659081 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 19:22:52.659228 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.659132 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:22:52.659228 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.659132 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 19:22:52.659228 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.659142 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-mh9lz\"" Apr 20 19:22:52.659679 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.659663 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 19:22:52.665969 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.665943 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 19:22:52.672428 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.672409 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xl6pd"] Apr 20 19:22:52.705215 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.705185 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94687d9-038f-401c-95c1-1a65b578340b-serving-cert\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.705412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.705240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv5s8\" (UniqueName: \"kubernetes.io/projected/92b63e56-5288-415d-a75a-4c18d45ecf72-kube-api-access-lv5s8\") pod \"volume-data-source-validator-7c6cbb6c87-zb2mk\" (UID: \"92b63e56-5288-415d-a75a-4c18d45ecf72\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" Apr 20 19:22:52.705412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.705323 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24zf\" (UniqueName: \"kubernetes.io/projected/c94687d9-038f-401c-95c1-1a65b578340b-kube-api-access-h24zf\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.705412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.705362 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94687d9-038f-401c-95c1-1a65b578340b-config\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.705412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.705403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c94687d9-038f-401c-95c1-1a65b578340b-trusted-ca\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.713732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.713705 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv5s8\" (UniqueName: \"kubernetes.io/projected/92b63e56-5288-415d-a75a-4c18d45ecf72-kube-api-access-lv5s8\") pod \"volume-data-source-validator-7c6cbb6c87-zb2mk\" (UID: \"92b63e56-5288-415d-a75a-4c18d45ecf72\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" Apr 20 19:22:52.805999 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.805921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94687d9-038f-401c-95c1-1a65b578340b-serving-cert\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.805999 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.805976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h24zf\" (UniqueName: \"kubernetes.io/projected/c94687d9-038f-401c-95c1-1a65b578340b-kube-api-access-h24zf\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.806170 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.806112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94687d9-038f-401c-95c1-1a65b578340b-config\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.806207 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.806184 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c94687d9-038f-401c-95c1-1a65b578340b-trusted-ca\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: 
\"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.807237 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.807220 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c94687d9-038f-401c-95c1-1a65b578340b-config\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.807922 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.807904 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c94687d9-038f-401c-95c1-1a65b578340b-trusted-ca\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.808894 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.808875 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c94687d9-038f-401c-95c1-1a65b578340b-serving-cert\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.813949 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.813927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24zf\" (UniqueName: \"kubernetes.io/projected/c94687d9-038f-401c-95c1-1a65b578340b-kube-api-access-h24zf\") pod \"console-operator-9d4b6777b-xl6pd\" (UID: \"c94687d9-038f-401c-95c1-1a65b578340b\") " pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.859973 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.859923 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" Apr 20 19:22:52.966687 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.966656 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:22:52.976715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:52.976686 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk"] Apr 20 19:22:52.979783 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:22:52.979753 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92b63e56_5288_415d_a75a_4c18d45ecf72.slice/crio-49d89c510a8b9c56c5d072387515144f804d1bf0a0991b21b9b19797e40d4df9 WatchSource:0}: Error finding container 49d89c510a8b9c56c5d072387515144f804d1bf0a0991b21b9b19797e40d4df9: Status 404 returned error can't find the container with id 49d89c510a8b9c56c5d072387515144f804d1bf0a0991b21b9b19797e40d4df9 Apr 20 19:22:53.093482 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:53.093405 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xl6pd"] Apr 20 19:22:53.096584 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:22:53.096534 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc94687d9_038f_401c_95c1_1a65b578340b.slice/crio-b7455ef306e3f6664553198a9c172de42b8763d0d096ae5a7f97e13eae3199a4 WatchSource:0}: Error finding container b7455ef306e3f6664553198a9c172de42b8763d0d096ae5a7f97e13eae3199a4: Status 404 returned error can't find the container with id b7455ef306e3f6664553198a9c172de42b8763d0d096ae5a7f97e13eae3199a4 Apr 20 19:22:53.936014 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:53.935971 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" event={"ID":"c94687d9-038f-401c-95c1-1a65b578340b","Type":"ContainerStarted","Data":"b7455ef306e3f6664553198a9c172de42b8763d0d096ae5a7f97e13eae3199a4"} Apr 20 19:22:53.937403 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:53.937367 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" event={"ID":"92b63e56-5288-415d-a75a-4c18d45ecf72","Type":"ContainerStarted","Data":"49d89c510a8b9c56c5d072387515144f804d1bf0a0991b21b9b19797e40d4df9"} Apr 20 19:22:54.940619 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:54.940577 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" event={"ID":"92b63e56-5288-415d-a75a-4c18d45ecf72","Type":"ContainerStarted","Data":"ab5e1262f247450e691427b51580610f93d6fa310b807c7f17fd30f59b79611d"} Apr 20 19:22:54.955399 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:54.955337 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zb2mk" podStartSLOduration=1.68488824 podStartE2EDuration="2.955322555s" podCreationTimestamp="2026-04-20 19:22:52 +0000 UTC" firstStartedPulling="2026-04-20 19:22:52.981605629 +0000 UTC m=+140.137155430" lastFinishedPulling="2026-04-20 19:22:54.252039932 +0000 UTC m=+141.407589745" observedRunningTime="2026-04-20 19:22:54.954106582 +0000 UTC m=+142.109656401" watchObservedRunningTime="2026-04-20 19:22:54.955322555 +0000 UTC m=+142.110872376" Apr 20 19:22:55.944433 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:55.944404 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/0.log" Apr 20 19:22:55.944885 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:22:55.944444 2580 generic.go:358] "Generic (PLEG): container finished" podID="c94687d9-038f-401c-95c1-1a65b578340b" containerID="2ad3a7fe6f398317694bc4f4548e2ed58d9a1ab278d6f5694c589d1f3a7a45a1" exitCode=255 Apr 20 19:22:55.944885 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:55.944531 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" event={"ID":"c94687d9-038f-401c-95c1-1a65b578340b","Type":"ContainerDied","Data":"2ad3a7fe6f398317694bc4f4548e2ed58d9a1ab278d6f5694c589d1f3a7a45a1"} Apr 20 19:22:55.944885 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:55.944723 2580 scope.go:117] "RemoveContainer" containerID="2ad3a7fe6f398317694bc4f4548e2ed58d9a1ab278d6f5694c589d1f3a7a45a1" Apr 20 19:22:56.948405 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:56.948378 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/1.log" Apr 20 19:22:56.948829 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:56.948706 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/0.log" Apr 20 19:22:56.948829 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:56.948740 2580 generic.go:358] "Generic (PLEG): container finished" podID="c94687d9-038f-401c-95c1-1a65b578340b" containerID="8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6" exitCode=255 Apr 20 19:22:56.948829 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:56.948794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" event={"ID":"c94687d9-038f-401c-95c1-1a65b578340b","Type":"ContainerDied","Data":"8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6"} Apr 20 19:22:56.948829 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:22:56.948820 2580 scope.go:117] "RemoveContainer" containerID="2ad3a7fe6f398317694bc4f4548e2ed58d9a1ab278d6f5694c589d1f3a7a45a1" Apr 20 19:22:56.949101 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:56.949076 2580 scope.go:117] "RemoveContainer" containerID="8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6" Apr 20 19:22:56.949331 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:56.949306 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xl6pd_openshift-console-operator(c94687d9-038f-401c-95c1-1a65b578340b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" podUID="c94687d9-038f-401c-95c1-1a65b578340b" Apr 20 19:22:57.952139 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:57.952111 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/1.log" Apr 20 19:22:57.952584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:57.952458 2580 scope.go:117] "RemoveContainer" containerID="8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6" Apr 20 19:22:57.952662 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:57.952619 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xl6pd_openshift-console-operator(c94687d9-038f-401c-95c1-1a65b578340b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" podUID="c94687d9-038f-401c-95c1-1a65b578340b" Apr 20 19:22:58.250539 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:58.250458 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:58.250539 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:58.250515 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:22:58.250737 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:58.250639 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:23:14.250615229 +0000 UTC m=+161.406165045 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : configmap references non-existent config key: service-ca.crt Apr 20 19:22:58.250737 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:58.250676 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:22:58.250737 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:22:58.250729 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs podName:b2caeaea-f388-4a16-a139-404c07f66f1e nodeName:}" failed. No retries permitted until 2026-04-20 19:23:14.250717279 +0000 UTC m=+161.406267075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs") pod "router-default-76f79b8cc6-4kmk9" (UID: "b2caeaea-f388-4a16-a139-404c07f66f1e") : secret "router-metrics-certs-default" not found Apr 20 19:22:59.098234 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.098198 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j25ff"] Apr 20 19:22:59.102292 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.102276 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.104542 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.104524 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 19:22:59.105419 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.105394 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 19:22:59.105419 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.105412 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 19:22:59.105537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.105413 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-vkxf4\"" Apr 20 19:22:59.105537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.105435 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 19:22:59.107031 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.107002 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j25ff"] Apr 20 19:22:59.159298 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:22:59.159264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/62841671-3d72-4f03-b100-d31f10a5a448-signing-key\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.159298 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.159297 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/62841671-3d72-4f03-b100-d31f10a5a448-signing-cabundle\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.159482 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.159399 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjr77\" (UniqueName: \"kubernetes.io/projected/62841671-3d72-4f03-b100-d31f10a5a448-kube-api-access-hjr77\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.260590 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.260554 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjr77\" (UniqueName: \"kubernetes.io/projected/62841671-3d72-4f03-b100-d31f10a5a448-kube-api-access-hjr77\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.260746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.260692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/62841671-3d72-4f03-b100-d31f10a5a448-signing-key\") pod \"service-ca-865cb79987-j25ff\" (UID: 
\"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.260746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.260721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/62841671-3d72-4f03-b100-d31f10a5a448-signing-cabundle\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.261356 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.261335 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/62841671-3d72-4f03-b100-d31f10a5a448-signing-cabundle\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.263396 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.263373 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/62841671-3d72-4f03-b100-d31f10a5a448-signing-key\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.268946 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.268916 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjr77\" (UniqueName: \"kubernetes.io/projected/62841671-3d72-4f03-b100-d31f10a5a448-kube-api-access-hjr77\") pod \"service-ca-865cb79987-j25ff\" (UID: \"62841671-3d72-4f03-b100-d31f10a5a448\") " pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.411569 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.411496 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j25ff" Apr 20 19:22:59.529241 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.529210 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j25ff"] Apr 20 19:22:59.532657 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:22:59.532622 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62841671_3d72_4f03_b100_d31f10a5a448.slice/crio-71368bb2cf9ee7103352e792223e2d18347b19ac9239d6651b6530db7807e8b5 WatchSource:0}: Error finding container 71368bb2cf9ee7103352e792223e2d18347b19ac9239d6651b6530db7807e8b5: Status 404 returned error can't find the container with id 71368bb2cf9ee7103352e792223e2d18347b19ac9239d6651b6530db7807e8b5 Apr 20 19:22:59.956446 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:22:59.956415 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j25ff" event={"ID":"62841671-3d72-4f03-b100-d31f10a5a448","Type":"ContainerStarted","Data":"71368bb2cf9ee7103352e792223e2d18347b19ac9239d6651b6530db7807e8b5"} Apr 20 19:23:01.962384 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:01.962351 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j25ff" event={"ID":"62841671-3d72-4f03-b100-d31f10a5a448","Type":"ContainerStarted","Data":"a70ea97ca7bf4df822e94cb68ac2df50f2af34ac3397787f4e8a100045138192"} Apr 20 19:23:01.977002 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:01.976957 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-j25ff" podStartSLOduration=1.274764533 podStartE2EDuration="2.976942344s" podCreationTimestamp="2026-04-20 19:22:59 +0000 UTC" firstStartedPulling="2026-04-20 19:22:59.534429768 +0000 UTC m=+146.689979568" lastFinishedPulling="2026-04-20 19:23:01.236607573 +0000 UTC 
m=+148.392157379" observedRunningTime="2026-04-20 19:23:01.976222791 +0000 UTC m=+149.131772609" watchObservedRunningTime="2026-04-20 19:23:01.976942344 +0000 UTC m=+149.132492162" Apr 20 19:23:02.966876 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:02.966846 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:23:02.966876 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:02.966879 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" Apr 20 19:23:02.967312 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:02.967174 2580 scope.go:117] "RemoveContainer" containerID="8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6" Apr 20 19:23:02.967350 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:02.967334 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xl6pd_openshift-console-operator(c94687d9-038f-401c-95c1-1a65b578340b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" podUID="c94687d9-038f-401c-95c1-1a65b578340b" Apr 20 19:23:10.685363 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:10.685314 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" Apr 20 19:23:10.693489 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:10.693456 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" podUID="3fa1e505-222b-4d26-b6c6-b500bff9d597" Apr 20 19:23:10.702126 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:10.702081 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8q97z" podUID="b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2" Apr 20 19:23:10.710242 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:10.710216 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-h2wph" podUID="593cc9e9-b499-4686-897a-e1b604685e20" Apr 20 19:23:10.987335 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:10.987220 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-h2wph" Apr 20 19:23:10.987335 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:10.987275 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:23:10.987543 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:10.987467 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:23:10.987543 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:10.987494 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:23:12.559546 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:12.559505 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mw5qh" podUID="a8ada6b3-5038-4d1c-bbe5-a9626c8c1987" Apr 20 19:23:14.287562 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.287501 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:14.287943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.287601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:14.288106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.288085 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2caeaea-f388-4a16-a139-404c07f66f1e-service-ca-bundle\") pod \"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:14.290004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.289985 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b2caeaea-f388-4a16-a139-404c07f66f1e-metrics-certs\") pod 
\"router-default-76f79b8cc6-4kmk9\" (UID: \"b2caeaea-f388-4a16-a139-404c07f66f1e\") " pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:14.548308 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.548230 2580 scope.go:117] "RemoveContainer" containerID="8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6" Apr 20 19:23:14.584795 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.584772 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:14.708432 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.708400 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-76f79b8cc6-4kmk9"] Apr 20 19:23:14.731227 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:14.731186 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2caeaea_f388_4a16_a139_404c07f66f1e.slice/crio-20d44b2750ab0b5ff788538c0d51207afac6f3ea886251e07477d9eb1cd162cf WatchSource:0}: Error finding container 20d44b2750ab0b5ff788538c0d51207afac6f3ea886251e07477d9eb1cd162cf: Status 404 returned error can't find the container with id 20d44b2750ab0b5ff788538c0d51207afac6f3ea886251e07477d9eb1cd162cf Apr 20 19:23:14.998004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.997936 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:23:14.998284 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.998265 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/1.log" Apr 20 19:23:14.998342 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.998298 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="c94687d9-038f-401c-95c1-1a65b578340b" containerID="223bb3785b2843113aaff86f0e3d253dece1fc6ae8a4bc166aff1d852bcad6c9" exitCode=255 Apr 20 19:23:14.998379 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.998364 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" event={"ID":"c94687d9-038f-401c-95c1-1a65b578340b","Type":"ContainerDied","Data":"223bb3785b2843113aaff86f0e3d253dece1fc6ae8a4bc166aff1d852bcad6c9"} Apr 20 19:23:14.998418 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.998395 2580 scope.go:117] "RemoveContainer" containerID="8220328530b3fa266455ff5f3d88404baaba4e2ff00231851f1507a759afe7d6" Apr 20 19:23:14.998750 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.998729 2580 scope.go:117] "RemoveContainer" containerID="223bb3785b2843113aaff86f0e3d253dece1fc6ae8a4bc166aff1d852bcad6c9" Apr 20 19:23:14.998937 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:14.998919 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-xl6pd_openshift-console-operator(c94687d9-038f-401c-95c1-1a65b578340b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" podUID="c94687d9-038f-401c-95c1-1a65b578340b" Apr 20 19:23:14.999646 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.999622 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" event={"ID":"b2caeaea-f388-4a16-a139-404c07f66f1e","Type":"ContainerStarted","Data":"50452333b03b027b5c4a99a25b6223a35997f35305a92b8ced8506393513723f"} Apr 20 19:23:14.999751 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:14.999650 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" 
event={"ID":"b2caeaea-f388-4a16-a139-404c07f66f1e","Type":"ContainerStarted","Data":"20d44b2750ab0b5ff788538c0d51207afac6f3ea886251e07477d9eb1cd162cf"} Apr 20 19:23:15.029582 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.029541 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" podStartSLOduration=33.029528876 podStartE2EDuration="33.029528876s" podCreationTimestamp="2026-04-20 19:22:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:23:15.028608317 +0000 UTC m=+162.184158135" watchObservedRunningTime="2026-04-20 19:23:15.029528876 +0000 UTC m=+162.185078695" Apr 20 19:23:15.585865 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.585828 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:15.588513 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.588492 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9" Apr 20 19:23:15.699768 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.699731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: \"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:23:15.699941 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.699781 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " 
pod="openshift-dns/dns-default-h2wph" Apr 20 19:23:15.699984 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.699938 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:23:15.700018 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.699987 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:23:15.702407 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.702380 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/593cc9e9-b499-4686-897a-e1b604685e20-metrics-tls\") pod \"dns-default-h2wph\" (UID: \"593cc9e9-b499-4686-897a-e1b604685e20\") " pod="openshift-dns/dns-default-h2wph" Apr 20 19:23:15.702520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.702450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"image-registry-5755568fd6-9nlq5\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:23:15.702577 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.702556 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3fa1e505-222b-4d26-b6c6-b500bff9d597-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-76ngm\" (UID: 
\"3fa1e505-222b-4d26-b6c6-b500bff9d597\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:23:15.702612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.702570 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2-cert\") pod \"ingress-canary-8q97z\" (UID: \"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2\") " pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:23:15.790821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.790782 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ltv6t\"" Apr 20 19:23:15.790821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.790810 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mf28l\"" Apr 20 19:23:15.791078 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.790792 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-wvwpr\"" Apr 20 19:23:15.791078 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.790792 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qtv7w\"" Apr 20 19:23:15.799637 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.799619 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8q97z" Apr 20 19:23:15.799637 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.799631 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" Apr 20 19:23:15.799759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.799654 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-h2wph" Apr 20 19:23:15.799759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.799753 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:23:15.967825 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.967794 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5755568fd6-9nlq5"] Apr 20 19:23:15.970795 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:15.970769 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d02caf8_5ad7_4d7c_aad3_54babb0bd46b.slice/crio-584f0a84a91da385f9285330b635289e369be80d2a895c9219896ba0bbe5cbbc WatchSource:0}: Error finding container 584f0a84a91da385f9285330b635289e369be80d2a895c9219896ba0bbe5cbbc: Status 404 returned error can't find the container with id 584f0a84a91da385f9285330b635289e369be80d2a895c9219896ba0bbe5cbbc Apr 20 19:23:15.974058 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:15.974029 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-h2wph"] Apr 20 19:23:15.976879 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:15.976855 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod593cc9e9_b499_4686_897a_e1b604685e20.slice/crio-9b6af26519f9a5296ce6a815cde00ab6f693694287b278f9ae01acead1073a22 WatchSource:0}: Error finding container 9b6af26519f9a5296ce6a815cde00ab6f693694287b278f9ae01acead1073a22: Status 404 returned error can't find the container with id 9b6af26519f9a5296ce6a815cde00ab6f693694287b278f9ae01acead1073a22 Apr 20 19:23:16.003548 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.003530 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log"
Apr 20 19:23:16.004612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.004587 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" event={"ID":"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b","Type":"ContainerStarted","Data":"584f0a84a91da385f9285330b635289e369be80d2a895c9219896ba0bbe5cbbc"}
Apr 20 19:23:16.005599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.005581 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h2wph" event={"ID":"593cc9e9-b499-4686-897a-e1b604685e20","Type":"ContainerStarted","Data":"9b6af26519f9a5296ce6a815cde00ab6f693694287b278f9ae01acead1073a22"}
Apr 20 19:23:16.005761 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.005744 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9"
Apr 20 19:23:16.007050 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.007031 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-76f79b8cc6-4kmk9"
Apr 20 19:23:16.198966 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.198884 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8q97z"]
Apr 20 19:23:16.201890 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:16.201860 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-76ngm"]
Apr 20 19:23:16.205598 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:16.205574 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b97f68_52bc_4ef7_81c1_cbf2a7da14a2.slice/crio-bfbb0d0f19129ef33c8a416955bca25faab08f74dd19696071ca3ddca41e398d WatchSource:0}: Error finding container bfbb0d0f19129ef33c8a416955bca25faab08f74dd19696071ca3ddca41e398d: Status 404 returned error can't find the container with id bfbb0d0f19129ef33c8a416955bca25faab08f74dd19696071ca3ddca41e398d
Apr 20 19:23:16.206100 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:16.206079 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa1e505_222b_4d26_b6c6_b500bff9d597.slice/crio-5be3d268113b88a426f2a26c799b53ea5eb211906f0c21adb1c6bd0108e59ce3 WatchSource:0}: Error finding container 5be3d268113b88a426f2a26c799b53ea5eb211906f0c21adb1c6bd0108e59ce3: Status 404 returned error can't find the container with id 5be3d268113b88a426f2a26c799b53ea5eb211906f0c21adb1c6bd0108e59ce3
Apr 20 19:23:17.009350 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:17.009310 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8q97z" event={"ID":"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2","Type":"ContainerStarted","Data":"bfbb0d0f19129ef33c8a416955bca25faab08f74dd19696071ca3ddca41e398d"}
Apr 20 19:23:17.010982 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:17.010936 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" event={"ID":"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b","Type":"ContainerStarted","Data":"2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba"}
Apr 20 19:23:17.011105 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:17.011069 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:23:17.012585 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:17.012555 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" event={"ID":"3fa1e505-222b-4d26-b6c6-b500bff9d597","Type":"ContainerStarted","Data":"5be3d268113b88a426f2a26c799b53ea5eb211906f0c21adb1c6bd0108e59ce3"}
Apr 20 19:23:17.036117 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:17.035973 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" podStartSLOduration=163.035955855 podStartE2EDuration="2m43.035955855s" podCreationTimestamp="2026-04-20 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:23:17.035596636 +0000 UTC m=+164.191146467" watchObservedRunningTime="2026-04-20 19:23:17.035955855 +0000 UTC m=+164.191505676"
Apr 20 19:23:18.733724 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.733641 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6xz5p"]
Apr 20 19:23:18.736587 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.736567 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.738741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.738722 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 19:23:18.738741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.738735 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 19:23:18.739606 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.739586 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 19:23:18.739727 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.739660 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 19:23:18.739786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.739725 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-d2rzs\""
Apr 20 19:23:18.749141 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.749120 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6xz5p"]
Apr 20 19:23:18.811420 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.811391 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"]
Apr 20 19:23:18.814282 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.814262 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:18.817573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.817550 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-lk6sg\""
Apr 20 19:23:18.817779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.817745 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 19:23:18.818574 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.818533 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"]
Apr 20 19:23:18.821323 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.821304 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"
Apr 20 19:23:18.823484 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.823439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-6bdvd\""
Apr 20 19:23:18.825679 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.825658 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"]
Apr 20 19:23:18.827315 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.827297 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/352db113-3b67-4009-a8e0-926d28c4af22-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.827422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.827329 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r269\" (UniqueName: \"kubernetes.io/projected/352db113-3b67-4009-a8e0-926d28c4af22-kube-api-access-4r269\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.827422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.827362 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/352db113-3b67-4009-a8e0-926d28c4af22-data-volume\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.827422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.827419 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/352db113-3b67-4009-a8e0-926d28c4af22-crio-socket\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.827575 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.827477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/352db113-3b67-4009-a8e0-926d28c4af22-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.843994 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.843970 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"]
Apr 20 19:23:18.928596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928566 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/352db113-3b67-4009-a8e0-926d28c4af22-data-volume\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.928596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/352db113-3b67-4009-a8e0-926d28c4af22-crio-socket\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928645 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fdde5ed3-13c4-4768-8526-e3485db975eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fdl24\" (UID: \"fdde5ed3-13c4-4768-8526-e3485db975eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928666 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/352db113-3b67-4009-a8e0-926d28c4af22-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/352db113-3b67-4009-a8e0-926d28c4af22-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928774 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r269\" (UniqueName: \"kubernetes.io/projected/352db113-3b67-4009-a8e0-926d28c4af22-kube-api-access-4r269\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/352db113-3b67-4009-a8e0-926d28c4af22-crio-socket\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928818 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxqlx\" (UniqueName: \"kubernetes.io/projected/44ed524a-4aff-46b8-94cb-1df4d4aa3f4a-kube-api-access-sxqlx\") pod \"network-check-source-8894fc9bd-rtw9q\" (UID: \"44ed524a-4aff-46b8-94cb-1df4d4aa3f4a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"
Apr 20 19:23:18.928907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.928905 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/352db113-3b67-4009-a8e0-926d28c4af22-data-volume\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.929239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.929218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/352db113-3b67-4009-a8e0-926d28c4af22-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.931080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.931060 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/352db113-3b67-4009-a8e0-926d28c4af22-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:18.937571 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:18.937547 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r269\" (UniqueName: \"kubernetes.io/projected/352db113-3b67-4009-a8e0-926d28c4af22-kube-api-access-4r269\") pod \"insights-runtime-extractor-6xz5p\" (UID: \"352db113-3b67-4009-a8e0-926d28c4af22\") " pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:19.020222 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.020181 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h2wph" event={"ID":"593cc9e9-b499-4686-897a-e1b604685e20","Type":"ContainerStarted","Data":"e47e42b4dd7ab03b2526637f598bfdc24810a4e122bb6652b013fe9c62a31ea0"}
Apr 20 19:23:19.020380 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.020221 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-h2wph" event={"ID":"593cc9e9-b499-4686-897a-e1b604685e20","Type":"ContainerStarted","Data":"7f24dc22bd30e63c456b7197bfc1149cbd2c8e6b05e5e588d11356012500b314"}
Apr 20 19:23:19.020380 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.020336 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-h2wph"
Apr 20 19:23:19.021648 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.021625 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" event={"ID":"3fa1e505-222b-4d26-b6c6-b500bff9d597","Type":"ContainerStarted","Data":"2991d77e03ce376d179e42914df931db7368284122cd04e766ccf39c0bd063b5"}
Apr 20 19:23:19.022719 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.022701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8q97z" event={"ID":"b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2","Type":"ContainerStarted","Data":"c16d6274c83f636a6e9d509f8b8ce1908c4a4506c85256b59d28331e1eabcc3c"}
Apr 20 19:23:19.029437 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.029418 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxqlx\" (UniqueName: \"kubernetes.io/projected/44ed524a-4aff-46b8-94cb-1df4d4aa3f4a-kube-api-access-sxqlx\") pod \"network-check-source-8894fc9bd-rtw9q\" (UID: \"44ed524a-4aff-46b8-94cb-1df4d4aa3f4a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"
Apr 20 19:23:19.029536 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.029480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fdde5ed3-13c4-4768-8526-e3485db975eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fdl24\" (UID: \"fdde5ed3-13c4-4768-8526-e3485db975eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:19.031822 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.031802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/fdde5ed3-13c4-4768-8526-e3485db975eb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fdl24\" (UID: \"fdde5ed3-13c4-4768-8526-e3485db975eb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:19.038328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.038294 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-h2wph" podStartSLOduration=129.933824362 podStartE2EDuration="2m12.038283117s" podCreationTimestamp="2026-04-20 19:21:07 +0000 UTC" firstStartedPulling="2026-04-20 19:23:15.97959009 +0000 UTC m=+163.135139889" lastFinishedPulling="2026-04-20 19:23:18.084048844 +0000 UTC m=+165.239598644" observedRunningTime="2026-04-20 19:23:19.037412369 +0000 UTC m=+166.192962201" watchObservedRunningTime="2026-04-20 19:23:19.038283117 +0000 UTC m=+166.193832914"
Apr 20 19:23:19.041176 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.041158 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxqlx\" (UniqueName: \"kubernetes.io/projected/44ed524a-4aff-46b8-94cb-1df4d4aa3f4a-kube-api-access-sxqlx\") pod \"network-check-source-8894fc9bd-rtw9q\" (UID: \"44ed524a-4aff-46b8-94cb-1df4d4aa3f4a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"
Apr 20 19:23:19.047047 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.047029 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6xz5p"
Apr 20 19:23:19.054428 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.054394 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-76ngm" podStartSLOduration=154.179835992 podStartE2EDuration="2m36.054385088s" podCreationTimestamp="2026-04-20 19:20:43 +0000 UTC" firstStartedPulling="2026-04-20 19:23:16.208362751 +0000 UTC m=+163.363912550" lastFinishedPulling="2026-04-20 19:23:18.082911832 +0000 UTC m=+165.238461646" observedRunningTime="2026-04-20 19:23:19.053464893 +0000 UTC m=+166.209014704" watchObservedRunningTime="2026-04-20 19:23:19.054385088 +0000 UTC m=+166.209934906"
Apr 20 19:23:19.067462 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.067416 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8q97z" podStartSLOduration=130.188101117 podStartE2EDuration="2m12.067400911s" podCreationTimestamp="2026-04-20 19:21:07 +0000 UTC" firstStartedPulling="2026-04-20 19:23:16.207824965 +0000 UTC m=+163.363374774" lastFinishedPulling="2026-04-20 19:23:18.087124769 +0000 UTC m=+165.242674568" observedRunningTime="2026-04-20 19:23:19.06694173 +0000 UTC m=+166.222491549" watchObservedRunningTime="2026-04-20 19:23:19.067400911 +0000 UTC m=+166.222950727"
Apr 20 19:23:19.123318 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.123290 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:19.135271 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.132215 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"
Apr 20 19:23:19.182456 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.182415 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6xz5p"]
Apr 20 19:23:19.269456 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.269432 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"]
Apr 20 19:23:19.272893 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:19.272870 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdde5ed3_13c4_4768_8526_e3485db975eb.slice/crio-876dd8135c38b85a992a2ce22cc93a32d424cd9089636c361f4a58469577dfe4 WatchSource:0}: Error finding container 876dd8135c38b85a992a2ce22cc93a32d424cd9089636c361f4a58469577dfe4: Status 404 returned error can't find the container with id 876dd8135c38b85a992a2ce22cc93a32d424cd9089636c361f4a58469577dfe4
Apr 20 19:23:19.283594 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:19.283572 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q"]
Apr 20 19:23:19.287421 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:19.287149 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ed524a_4aff_46b8_94cb_1df4d4aa3f4a.slice/crio-35fabf0033d2ca4044359b419801dda269f26eae39e3e080914da976c29cc7c1 WatchSource:0}: Error finding container 35fabf0033d2ca4044359b419801dda269f26eae39e3e080914da976c29cc7c1: Status 404 returned error can't find the container with id 35fabf0033d2ca4044359b419801dda269f26eae39e3e080914da976c29cc7c1
Apr 20 19:23:20.027513 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:20.027478 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q" event={"ID":"44ed524a-4aff-46b8-94cb-1df4d4aa3f4a","Type":"ContainerStarted","Data":"0a0f0348cf780db8957c663e3fdecb3fe2cc9833e4137ad557003a7023a04134"}
Apr 20 19:23:20.027973 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:20.027521 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q" event={"ID":"44ed524a-4aff-46b8-94cb-1df4d4aa3f4a","Type":"ContainerStarted","Data":"35fabf0033d2ca4044359b419801dda269f26eae39e3e080914da976c29cc7c1"}
Apr 20 19:23:20.028865 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:20.028839 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24" event={"ID":"fdde5ed3-13c4-4768-8526-e3485db975eb","Type":"ContainerStarted","Data":"876dd8135c38b85a992a2ce22cc93a32d424cd9089636c361f4a58469577dfe4"}
Apr 20 19:23:20.030983 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:20.030925 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6xz5p" event={"ID":"352db113-3b67-4009-a8e0-926d28c4af22","Type":"ContainerStarted","Data":"c7ca881600ac769b35c3c6a12b54ccd493eb088a6c2374683f1f7ec2be76c414"}
Apr 20 19:23:20.030983 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:20.030955 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6xz5p" event={"ID":"352db113-3b67-4009-a8e0-926d28c4af22","Type":"ContainerStarted","Data":"98cfef55fc2947e511f2ff1480ac61ae0a32ca61ac9bb1ed7051f2512aac2ccc"}
Apr 20 19:23:20.043300 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:20.042806 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-rtw9q" podStartSLOduration=2.042788466 podStartE2EDuration="2.042788466s" podCreationTimestamp="2026-04-20 19:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:23:20.042429486 +0000 UTC m=+167.197979321" watchObservedRunningTime="2026-04-20 19:23:20.042788466 +0000 UTC m=+167.198338286"
Apr 20 19:23:21.034942 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:21.034888 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6xz5p" event={"ID":"352db113-3b67-4009-a8e0-926d28c4af22","Type":"ContainerStarted","Data":"0d5194e40c90ab8c3fc5ebc8758fe4db6c9cf9123e168fa6ccb3e2594cce3ffd"}
Apr 20 19:23:21.036429 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:21.036395 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24" event={"ID":"fdde5ed3-13c4-4768-8526-e3485db975eb","Type":"ContainerStarted","Data":"81c67dafad6f3efbeabc4af795ca503109e111c6e218c0c85737e0b4e4082b05"}
Apr 20 19:23:21.069070 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:21.069005 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24" podStartSLOduration=1.951680248 podStartE2EDuration="3.068986712s" podCreationTimestamp="2026-04-20 19:23:18 +0000 UTC" firstStartedPulling="2026-04-20 19:23:19.27501803 +0000 UTC m=+166.430567832" lastFinishedPulling="2026-04-20 19:23:20.392324484 +0000 UTC m=+167.547874296" observedRunningTime="2026-04-20 19:23:21.068217584 +0000 UTC m=+168.223767406" watchObservedRunningTime="2026-04-20 19:23:21.068986712 +0000 UTC m=+168.224536534"
Apr 20 19:23:22.041218 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.041175 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6xz5p" event={"ID":"352db113-3b67-4009-a8e0-926d28c4af22","Type":"ContainerStarted","Data":"f39a91069f61d9fbe90ce4efb731981a15cde2889b96940842cf8aba5754aaad"}
Apr 20 19:23:22.041702 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.041321 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:22.045943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.045922 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fdl24"
Apr 20 19:23:22.058698 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.058654 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6xz5p" podStartSLOduration=1.725100532 podStartE2EDuration="4.058641508s" podCreationTimestamp="2026-04-20 19:23:18 +0000 UTC" firstStartedPulling="2026-04-20 19:23:19.258595643 +0000 UTC m=+166.414145455" lastFinishedPulling="2026-04-20 19:23:21.592136622 +0000 UTC m=+168.747686431" observedRunningTime="2026-04-20 19:23:22.057509068 +0000 UTC m=+169.213058888" watchObservedRunningTime="2026-04-20 19:23:22.058641508 +0000 UTC m=+169.214191326"
Apr 20 19:23:22.967773 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.967735 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd"
Apr 20 19:23:22.967773 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.967778 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd"
Apr 20 19:23:22.968199 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:22.968169 2580 scope.go:117] "RemoveContainer" containerID="223bb3785b2843113aaff86f0e3d253dece1fc6ae8a4bc166aff1d852bcad6c9"
Apr 20 19:23:22.968434 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:22.968411 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-xl6pd_openshift-console-operator(c94687d9-038f-401c-95c1-1a65b578340b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" podUID="c94687d9-038f-401c-95c1-1a65b578340b"
Apr 20 19:23:24.548266 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:24.548207 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh"
Apr 20 19:23:27.160820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.160778 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"]
Apr 20 19:23:27.165609 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.165588 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"
Apr 20 19:23:27.167731 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.167706 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 19:23:27.168628 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.168610 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 19:23:27.168779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.168758 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 19:23:27.169404 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.169384 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 19:23:27.169545 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.169529 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2t5rt\""
Apr 20 19:23:27.169618 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.169539 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 19:23:27.176897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.176875 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"]
Apr 20 19:23:27.186328 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.186302 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f8lth"]
Apr 20 19:23:27.189835 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.189813 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth"
Apr 20 19:23:27.192272 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.192223 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-nkcff\""
Apr 20 19:23:27.192485 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.192464 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 19:23:27.192600 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.192582 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 19:23:27.192755 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.192464 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 19:23:27.203261 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.203221 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nqt9s"]
Apr 20 19:23:27.206441 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.206415 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-f8lth"]
Apr 20 19:23:27.206533 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.206521 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nqt9s"
Apr 20 19:23:27.209236 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.208551 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 19:23:27.209236 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.208663 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 19:23:27.209236 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.208730 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 19:23:27.209236 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.208863 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kfq2q\""
Apr 20 19:23:27.295886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.295842 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ff99f872-5830-475e-b0ab-2019f63a53c2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth"
Apr 20 19:23:27.295886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.295887 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff99f872-5830-475e-b0ab-2019f63a53c2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.295908 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-wtmp\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.295986 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-root\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296015 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw6jc\" (UniqueName: \"kubernetes.io/projected/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-api-access-tw6jc\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296045 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5hq6\" (UniqueName: \"kubernetes.io/projected/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-kube-api-access-k5hq6\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296094 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296137 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75znr\" (UniqueName: \"kubernetes.io/projected/3f387372-bc50-4d3b-b439-725017d71a0f-kube-api-access-75znr\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"
Apr 20 19:23:27.296160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296156 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s"
Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296200 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth"
Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296223 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-metrics-client-ca\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s"
Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296270 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f387372-bc50-4d3b-b439-725017d71a0f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"
Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296335 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth"
Apr 20 19:23:27.296584
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296374 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-sys\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-textfile\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296437 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-accelerators-collector-config\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296493 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-tls\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.296584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.296522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.397193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397140 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f387372-bc50-4d3b-b439-725017d71a0f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.397193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.397461 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-sys\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.397461 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-textfile\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.397461 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:23:27.397357 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-sys\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.397461 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:27.397382 2580 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:27.397472 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-tls podName:ff99f872-5830-475e-b0ab-2019f63a53c2 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:27.897451187 +0000 UTC m=+175.053000997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-f8lth" (UID: "ff99f872-5830-475e-b0ab-2019f63a53c2") : secret "kube-state-metrics-tls" not found Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397502 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-accelerators-collector-config\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-tls\") pod 
\"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397609 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-textfile\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397616 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ff99f872-5830-475e-b0ab-2019f63a53c2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.397659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff99f872-5830-475e-b0ab-2019f63a53c2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.398004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397669 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-wtmp\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.398004 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:27.397705 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:23:27.398004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-root\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.398004 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:27.397760 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-tls podName:d41e64ff-5632-4dbf-9594-58ad2cd1ccc5 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:27.897744488 +0000 UTC m=+175.053294294 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-tls") pod "node-exporter-nqt9s" (UID: "d41e64ff-5632-4dbf-9594-58ad2cd1ccc5") : secret "node-exporter-tls" not found Apr 20 19:23:27.398004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397771 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-root\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.398004 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.397971 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f387372-bc50-4d3b-b439-725017d71a0f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398126 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ff99f872-5830-475e-b0ab-2019f63a53c2-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398124 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw6jc\" (UniqueName: \"kubernetes.io/projected/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-api-access-tw6jc\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.398324 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398177 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398191 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-wtmp\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398208 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5hq6\" (UniqueName: \"kubernetes.io/projected/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-kube-api-access-k5hq6\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398273 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:27.398299 2580 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 20 
19:23:27.398324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75znr\" (UniqueName: \"kubernetes.io/projected/3f387372-bc50-4d3b-b439-725017d71a0f-kube-api-access-75znr\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.398730 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:23:27.398354 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-tls podName:3f387372-bc50-4d3b-b439-725017d71a0f nodeName:}" failed. No retries permitted until 2026-04-20 19:23:27.898338857 +0000 UTC m=+175.053888654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-p8t2c" (UID: "3f387372-bc50-4d3b-b439-725017d71a0f") : secret "openshift-state-metrics-tls" not found Apr 20 19:23:27.398730 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.398730 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398440 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.398730 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.398480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-metrics-client-ca\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.399066 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.399041 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-metrics-client-ca\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.399485 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.399459 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.399665 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.399647 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-accelerators-collector-config\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.400224 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.400202 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff99f872-5830-475e-b0ab-2019f63a53c2-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.401464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.400950 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.401464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.401017 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.402076 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.402047 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.407072 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.407046 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw6jc\" (UniqueName: 
\"kubernetes.io/projected/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-api-access-tw6jc\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.407422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.407396 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5hq6\" (UniqueName: \"kubernetes.io/projected/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-kube-api-access-k5hq6\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.407961 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.407937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75znr\" (UniqueName: \"kubernetes.io/projected/3f387372-bc50-4d3b-b439-725017d71a0f-kube-api-access-75znr\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.902293 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.902231 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:27.902502 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.902345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.902502 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.902383 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-tls\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.904915 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.904884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d41e64ff-5632-4dbf-9594-58ad2cd1ccc5-node-exporter-tls\") pod \"node-exporter-nqt9s\" (UID: \"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5\") " pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:27.905022 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.904948 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff99f872-5830-475e-b0ab-2019f63a53c2-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-f8lth\" (UID: \"ff99f872-5830-475e-b0ab-2019f63a53c2\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:27.905022 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:27.904959 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f387372-bc50-4d3b-b439-725017d71a0f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-p8t2c\" (UID: \"3f387372-bc50-4d3b-b439-725017d71a0f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:28.076801 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.076760 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" Apr 20 19:23:28.100556 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.100518 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" Apr 20 19:23:28.117844 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.117809 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nqt9s" Apr 20 19:23:28.131223 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:28.130440 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41e64ff_5632_4dbf_9594_58ad2cd1ccc5.slice/crio-eb0987bf719e94674f182c2bf4bf35d9889e5f6d31304e5e521040dc12db7fb9 WatchSource:0}: Error finding container eb0987bf719e94674f182c2bf4bf35d9889e5f6d31304e5e521040dc12db7fb9: Status 404 returned error can't find the container with id eb0987bf719e94674f182c2bf4bf35d9889e5f6d31304e5e521040dc12db7fb9 Apr 20 19:23:28.230350 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.230316 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c"] Apr 20 19:23:28.233784 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:28.233754 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f387372_bc50_4d3b_b439_725017d71a0f.slice/crio-b6f59568ccd594a09e98ce11c0894f912de11c5ceb3415d66ddb78826f9ae035 WatchSource:0}: Error finding container b6f59568ccd594a09e98ce11c0894f912de11c5ceb3415d66ddb78826f9ae035: Status 404 returned error can't find the container with id b6f59568ccd594a09e98ce11c0894f912de11c5ceb3415d66ddb78826f9ae035 Apr 20 19:23:28.243031 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.243006 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/kube-state-metrics-69db897b98-f8lth"] Apr 20 19:23:28.248293 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:28.248263 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff99f872_5830_475e_b0ab_2019f63a53c2.slice/crio-64dc0a5fa9de79cea9a7cff23f4bf1ae20f559bfd4034864113962ae8c739ec9 WatchSource:0}: Error finding container 64dc0a5fa9de79cea9a7cff23f4bf1ae20f559bfd4034864113962ae8c739ec9: Status 404 returned error can't find the container with id 64dc0a5fa9de79cea9a7cff23f4bf1ae20f559bfd4034864113962ae8c739ec9 Apr 20 19:23:28.253630 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.253604 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:23:28.258501 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.258479 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260751 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260773 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260784 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260835 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260756 2580 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260894 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.260753 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 19:23:28.261017 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.261007 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 19:23:28.261640 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.261315 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 19:23:28.263324 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.262695 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-b976t\"" Apr 20 19:23:28.269671 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.269649 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:23:28.406584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406545 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406586 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406718 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406741 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-web-config\") pod \"alertmanager-main-0\" 
(UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406916 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406813 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406916 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406841 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406987 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406910 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-out\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.406987 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406948 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-volume\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.407061 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.406986 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.407061 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.407051 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.407136 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.407071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-782jb\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-kube-api-access-782jb\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.507786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.507786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507738 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-web-config\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.507786 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:23:28.507775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.507786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507794 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507816 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-out\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507840 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-volume\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507870 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.507953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-782jb\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-kube-api-access-782jb\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.508031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.508056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508474 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.508094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508474 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.508120 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.508474 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.508345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.509506 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.509164 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.509506 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.509164 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.511046 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.511018 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-out\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.511046 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.511041 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.511285 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.511234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-web-config\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.511569 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.511545 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.511712 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.511679 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.512281 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.512234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-volume\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.512893 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.512868 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.513009 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.512975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.513054 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.513036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.515032 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.515016 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-782jb\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-kube-api-access-782jb\") pod \"alertmanager-main-0\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.585759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.585725 2580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:23:28.720667 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:28.720636 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:23:28.723767 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:28.723732 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c58cdd6_e375_4b85_80bc_e01fbad7f866.slice/crio-f7c7f9894bd805168d27adcd514e76007dcb193e1282b742a37445dc8f6c749c WatchSource:0}: Error finding container f7c7f9894bd805168d27adcd514e76007dcb193e1282b742a37445dc8f6c749c: Status 404 returned error can't find the container with id f7c7f9894bd805168d27adcd514e76007dcb193e1282b742a37445dc8f6c749c Apr 20 19:23:29.033527 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.033498 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-h2wph" Apr 20 19:23:29.063303 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.063198 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" event={"ID":"ff99f872-5830-475e-b0ab-2019f63a53c2","Type":"ContainerStarted","Data":"64dc0a5fa9de79cea9a7cff23f4bf1ae20f559bfd4034864113962ae8c739ec9"} Apr 20 19:23:29.068327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.068290 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" event={"ID":"3f387372-bc50-4d3b-b439-725017d71a0f","Type":"ContainerStarted","Data":"d7b0646c2047e48c86a26a4ef659e4eb9dc6016e843d91442bceafa0fbd8d04f"} Apr 20 19:23:29.068451 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.068339 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" 
event={"ID":"3f387372-bc50-4d3b-b439-725017d71a0f","Type":"ContainerStarted","Data":"57b1cb5d73d51d0e2b682dc9d2ef51e488c3c1c1ab4e3a5549c1493e27297f56"} Apr 20 19:23:29.068451 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.068354 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" event={"ID":"3f387372-bc50-4d3b-b439-725017d71a0f","Type":"ContainerStarted","Data":"b6f59568ccd594a09e98ce11c0894f912de11c5ceb3415d66ddb78826f9ae035"} Apr 20 19:23:29.070295 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.070243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"f7c7f9894bd805168d27adcd514e76007dcb193e1282b742a37445dc8f6c749c"} Apr 20 19:23:29.074784 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.074738 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqt9s" event={"ID":"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5","Type":"ContainerStarted","Data":"eb0987bf719e94674f182c2bf4bf35d9889e5f6d31304e5e521040dc12db7fb9"} Apr 20 19:23:29.247221 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.246745 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"] Apr 20 19:23:29.269106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.269055 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"] Apr 20 19:23:29.269315 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.269293 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.272015 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.271983 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 19:23:29.272160 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.272136 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 19:23:29.272229 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.272149 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 19:23:29.272653 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.272630 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 19:23:29.273806 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.273499 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 19:23:29.273806 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.273553 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4a1kcvkpivq99\"" Apr 20 19:23:29.273806 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.273501 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-dm8d7\"" Apr 20 19:23:29.417426 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417390 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.417573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417435 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-grpc-tls\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.417573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.417573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417504 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkqsq\" (UniqueName: \"kubernetes.io/projected/dd66325b-c9cf-4338-988a-2be2df804b9b-kube-api-access-vkqsq\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.417573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417527 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-tls\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" 
Apr 20 19:23:29.417573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417544 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd66325b-c9cf-4338-988a-2be2df804b9b-metrics-client-ca\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.417573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417566 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.417767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.417594 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518522 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518486 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518522 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:23:29.518530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkqsq\" (UniqueName: \"kubernetes.io/projected/dd66325b-c9cf-4338-988a-2be2df804b9b-kube-api-access-vkqsq\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518552 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-tls\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd66325b-c9cf-4338-988a-2be2df804b9b-metrics-client-ca\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.518778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.518731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-grpc-tls\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.519422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.519372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd66325b-c9cf-4338-988a-2be2df804b9b-metrics-client-ca\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.521578 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.521548 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-grpc-tls\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" Apr 20 19:23:29.521909 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.521881 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:29.522002 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.521940 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:29.522071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.522043 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:29.522123 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.522065 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:29.522443 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.522421 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dd66325b-c9cf-4338-988a-2be2df804b9b-secret-thanos-querier-tls\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:29.525742 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.525716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkqsq\" (UniqueName: \"kubernetes.io/projected/dd66325b-c9cf-4338-988a-2be2df804b9b-kube-api-access-vkqsq\") pod \"thanos-querier-c4bcd9dc5-m4dj4\" (UID: \"dd66325b-c9cf-4338-988a-2be2df804b9b\") " pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:29.630426 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:29.630339 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:30.079792 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:30.079757 2580 generic.go:358] "Generic (PLEG): container finished" podID="d41e64ff-5632-4dbf-9594-58ad2cd1ccc5" containerID="155e2da67158188db47e2f86da8d39d14169cdedf93ebf2016c38c377ca898d1" exitCode=0
Apr 20 19:23:30.079912 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:30.079844 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqt9s" event={"ID":"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5","Type":"ContainerDied","Data":"155e2da67158188db47e2f86da8d39d14169cdedf93ebf2016c38c377ca898d1"}
Apr 20 19:23:30.214720 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:30.214509 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"]
Apr 20 19:23:30.224291 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:30.223478 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd66325b_c9cf_4338_988a_2be2df804b9b.slice/crio-888c69538a7617a2ca2515a93bb19c25d8971aa5290bc8613ceeeb65cc2ef9fe WatchSource:0}: Error finding container 888c69538a7617a2ca2515a93bb19c25d8971aa5290bc8613ceeeb65cc2ef9fe: Status 404 returned error can't find the container with id 888c69538a7617a2ca2515a93bb19c25d8971aa5290bc8613ceeeb65cc2ef9fe
Apr 20 19:23:31.085739 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.085697 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqt9s" event={"ID":"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5","Type":"ContainerStarted","Data":"bd5382bee1fa9fecb820014cda6b50d9156e08ea9a1245f92dc89d52e0d0fa2c"}
Apr 20 19:23:31.085739 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.085742 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nqt9s" event={"ID":"d41e64ff-5632-4dbf-9594-58ad2cd1ccc5","Type":"ContainerStarted","Data":"109f689e1b1a5be2b36c884755fb7c61e2eb0711e378415ef575aaa2dd91b138"}
Apr 20 19:23:31.087925 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.087892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" event={"ID":"3f387372-bc50-4d3b-b439-725017d71a0f","Type":"ContainerStarted","Data":"3c02ae1911aa1265e64e8021aa76610a0e88f4ecbd7b1f29771feb2b77feacd8"}
Apr 20 19:23:31.089126 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.089101 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"888c69538a7617a2ca2515a93bb19c25d8971aa5290bc8613ceeeb65cc2ef9fe"}
Apr 20 19:23:31.091125 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.091096 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" event={"ID":"ff99f872-5830-475e-b0ab-2019f63a53c2","Type":"ContainerStarted","Data":"c37afa9329bb08e2f977ceef78d1df5a92562467304c932cb23b53302ee67a6d"}
Apr 20 19:23:31.091125 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.091124 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" event={"ID":"ff99f872-5830-475e-b0ab-2019f63a53c2","Type":"ContainerStarted","Data":"691c811484e345e2b4cdfc14af5107deb3fbe63bf9a68a8db4eb03a954e15f29"}
Apr 20 19:23:31.091329 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.091135 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" event={"ID":"ff99f872-5830-475e-b0ab-2019f63a53c2","Type":"ContainerStarted","Data":"f551fbb35bb7a31f54bbac0f72970e8a3ce2030c795adda7036edc7d5ab227ac"}
Apr 20 19:23:31.092475 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.092453 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2" exitCode=0
Apr 20 19:23:31.092599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.092484 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"}
Apr 20 19:23:31.105612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.105564 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nqt9s" podStartSLOduration=3.176978326 podStartE2EDuration="4.105548486s" podCreationTimestamp="2026-04-20 19:23:27 +0000 UTC" firstStartedPulling="2026-04-20 19:23:28.132233338 +0000 UTC m=+175.287783135" lastFinishedPulling="2026-04-20 19:23:29.060803488 +0000 UTC m=+176.216353295" observedRunningTime="2026-04-20 19:23:31.10445723 +0000 UTC m=+178.260007050" watchObservedRunningTime="2026-04-20 19:23:31.105548486 +0000 UTC m=+178.261098302"
Apr 20 19:23:31.177960 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.177907 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-p8t2c" podStartSLOduration=2.490170367 podStartE2EDuration="4.177890213s" podCreationTimestamp="2026-04-20 19:23:27 +0000 UTC" firstStartedPulling="2026-04-20 19:23:28.361038532 +0000 UTC m=+175.516588333" lastFinishedPulling="2026-04-20 19:23:30.048758367 +0000 UTC m=+177.204308179" observedRunningTime="2026-04-20 19:23:31.175999541 +0000 UTC m=+178.331549361" watchObservedRunningTime="2026-04-20 19:23:31.177890213 +0000 UTC m=+178.333440079"
Apr 20 19:23:31.204628 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:31.204577 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-f8lth" podStartSLOduration=2.4062001459999998 podStartE2EDuration="4.204552317s" podCreationTimestamp="2026-04-20 19:23:27 +0000 UTC" firstStartedPulling="2026-04-20 19:23:28.25036353 +0000 UTC m=+175.405913331" lastFinishedPulling="2026-04-20 19:23:30.048715694 +0000 UTC m=+177.204265502" observedRunningTime="2026-04-20 19:23:31.203677804 +0000 UTC m=+178.359227628" watchObservedRunningTime="2026-04-20 19:23:31.204552317 +0000 UTC m=+178.360102136"
Apr 20 19:23:33.101646 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.101613 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"ac5c94a45285c548ed575b264aace8b005887460fef868ce9a0b50f97cee75cd"}
Apr 20 19:23:33.102065 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.101653 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"ec53d5fbbf427d2e832a8f6e9e7d6aca64d21d7035c58095f7369b5ae75ddd42"}
Apr 20 19:23:33.102065 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.101669 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"6e9b0f1dee29ac21c7be544bcbd5e6c066e8527526073b9c8719e4fe9b50d65a"}
Apr 20 19:23:33.104332 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.104306 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"}
Apr 20 19:23:33.104452 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.104335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"}
Apr 20 19:23:33.104452 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.104345 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"}
Apr 20 19:23:33.104452 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.104354 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"}
Apr 20 19:23:33.104452 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:33.104362 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"}
Apr 20 19:23:34.109659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.109563 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"2c4c9a11e29b057db4918527887443c5ca117bcd533fd28aa389ce925fd5b8ac"}
Apr 20 19:23:34.109659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.109606 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"0026bba9e4687e6987bdf3f661801c934305ca215e1e4ff261cca66308d15491"}
Apr 20 19:23:34.109659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.109616 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" event={"ID":"dd66325b-c9cf-4338-988a-2be2df804b9b","Type":"ContainerStarted","Data":"7a75afd4e7cc1862bae92dd43cf8a1e1fc7461192ed76cb2c51de9e9e4067891"}
Apr 20 19:23:34.110162 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.109704 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:34.112344 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.112323 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerStarted","Data":"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"}
Apr 20 19:23:34.133098 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.133057 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4" podStartSLOduration=1.570834843 podStartE2EDuration="5.133042523s" podCreationTimestamp="2026-04-20 19:23:29 +0000 UTC" firstStartedPulling="2026-04-20 19:23:30.227205974 +0000 UTC m=+177.382755786" lastFinishedPulling="2026-04-20 19:23:33.789413666 +0000 UTC m=+180.944963466" observedRunningTime="2026-04-20 19:23:34.131036707 +0000 UTC m=+181.286586548" watchObservedRunningTime="2026-04-20 19:23:34.133042523 +0000 UTC m=+181.288592341"
Apr 20 19:23:34.163309 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:34.163230 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.098561362 podStartE2EDuration="6.163213938s" podCreationTimestamp="2026-04-20 19:23:28 +0000 UTC" firstStartedPulling="2026-04-20 19:23:28.726299676 +0000 UTC m=+175.881849475" lastFinishedPulling="2026-04-20 19:23:33.790952253 +0000 UTC m=+180.946502051" observedRunningTime="2026-04-20 19:23:34.161371 +0000 UTC m=+181.316920819" watchObservedRunningTime="2026-04-20 19:23:34.163213938 +0000 UTC m=+181.318763806"
Apr 20 19:23:35.553982 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:35.553943 2580 scope.go:117] "RemoveContainer" containerID="223bb3785b2843113aaff86f0e3d253dece1fc6ae8a4bc166aff1d852bcad6c9"
Apr 20 19:23:35.804367 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:35.804281 2580 patch_prober.go:28] interesting pod/image-registry-5755568fd6-9nlq5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 19:23:35.804367 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:35.804344 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 19:23:36.122244 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.122160 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log"
Apr 20 19:23:36.122414 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.122289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" event={"ID":"c94687d9-038f-401c-95c1-1a65b578340b","Type":"ContainerStarted","Data":"4a68fa255bbc6a270bdc1a4fc20fe7ba7fe6226a3c27b250934f0989a2b5f29b"}
Apr 20 19:23:36.122604 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.122580 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd"
Apr 20 19:23:36.130449 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.130429 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd"
Apr 20 19:23:36.138560 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.138522 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-xl6pd" podStartSLOduration=41.912263138 podStartE2EDuration="44.138509312s" podCreationTimestamp="2026-04-20 19:22:52 +0000 UTC" firstStartedPulling="2026-04-20 19:22:53.098433219 +0000 UTC m=+140.253983020" lastFinishedPulling="2026-04-20 19:22:55.324679397 +0000 UTC m=+142.480229194" observedRunningTime="2026-04-20 19:23:36.136994249 +0000 UTC m=+183.292544073" watchObservedRunningTime="2026-04-20 19:23:36.138509312 +0000 UTC m=+183.294059131"
Apr 20 19:23:36.301564 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.301531 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-gctks"]
Apr 20 19:23:36.303704 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.303686 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-gctks"
Apr 20 19:23:36.305760 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.305738 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 19:23:36.305877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.305745 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-thnw5\""
Apr 20 19:23:36.305877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.305792 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 19:23:36.312857 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.312814 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-gctks"]
Apr 20 19:23:36.400607 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.400518 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glng5\" (UniqueName: \"kubernetes.io/projected/dfd03727-181d-4602-92a1-1407031aec92-kube-api-access-glng5\") pod \"downloads-6bcc868b7-gctks\" (UID: \"dfd03727-181d-4602-92a1-1407031aec92\") " pod="openshift-console/downloads-6bcc868b7-gctks"
Apr 20 19:23:36.501503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.501459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glng5\" (UniqueName: \"kubernetes.io/projected/dfd03727-181d-4602-92a1-1407031aec92-kube-api-access-glng5\") pod \"downloads-6bcc868b7-gctks\" (UID: \"dfd03727-181d-4602-92a1-1407031aec92\") " pod="openshift-console/downloads-6bcc868b7-gctks"
Apr 20 19:23:36.509634 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.509607 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glng5\" (UniqueName: \"kubernetes.io/projected/dfd03727-181d-4602-92a1-1407031aec92-kube-api-access-glng5\") pod \"downloads-6bcc868b7-gctks\" (UID: \"dfd03727-181d-4602-92a1-1407031aec92\") " pod="openshift-console/downloads-6bcc868b7-gctks"
Apr 20 19:23:36.613479 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.613446 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-gctks"
Apr 20 19:23:36.737826 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:36.737791 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-gctks"]
Apr 20 19:23:36.740886 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:36.740850 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd03727_181d_4602_92a1_1407031aec92.slice/crio-17ad8c318b8f32a1872fc3fc3b0b493fccb7fad95da5167842202d6ef89a87e5 WatchSource:0}: Error finding container 17ad8c318b8f32a1872fc3fc3b0b493fccb7fad95da5167842202d6ef89a87e5: Status 404 returned error can't find the container with id 17ad8c318b8f32a1872fc3fc3b0b493fccb7fad95da5167842202d6ef89a87e5
Apr 20 19:23:37.127149 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:37.127099 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-gctks" event={"ID":"dfd03727-181d-4602-92a1-1407031aec92","Type":"ContainerStarted","Data":"17ad8c318b8f32a1872fc3fc3b0b493fccb7fad95da5167842202d6ef89a87e5"}
Apr 20 19:23:38.018779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:38.018750 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5"
Apr 20 19:23:40.123843 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:40.123812 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-c4bcd9dc5-m4dj4"
Apr 20 19:23:41.355798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:41.355766 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5755568fd6-9nlq5"]
Apr 20 19:23:46.428990 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.428953 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-79544b66db-4bjg6"]
Apr 20 19:23:46.433496 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.433476 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.436034 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.436011 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 19:23:46.436167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.436144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 19:23:46.437157 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.437137 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 19:23:46.437275 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.437158 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 19:23:46.437350 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.437276 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 19:23:46.437413 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.437378 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-c6svh\""
Apr 20 19:23:46.459968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.459927 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79544b66db-4bjg6"]
Apr 20 19:23:46.495407 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.495380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-serving-cert\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.495558 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.495419 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-oauth-config\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.495558 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.495473 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-oauth-serving-cert\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.495558 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.495524 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjck\" (UniqueName: \"kubernetes.io/projected/d5d99161-eec7-4e45-bc17-4ffe78c87e59-kube-api-access-xhjck\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.495727 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.495556 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-config\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.495727 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.495640 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-service-ca\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.596830 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.596787 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhjck\" (UniqueName: \"kubernetes.io/projected/d5d99161-eec7-4e45-bc17-4ffe78c87e59-kube-api-access-xhjck\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.596997 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.596844 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-config\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.596997 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.596879 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-service-ca\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.596997 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.596962 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-serving-cert\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.597164 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.597004 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-oauth-config\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.597164 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.597058 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-oauth-serving-cert\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.597703 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.597654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-config\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.597833 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.597702 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-service-ca\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.597833 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.597721 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-oauth-serving-cert\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.599892 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.599866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-oauth-config\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.600118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.600096 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-serving-cert\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.604895 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.604871 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhjck\" (UniqueName: \"kubernetes.io/projected/d5d99161-eec7-4e45-bc17-4ffe78c87e59-kube-api-access-xhjck\") pod \"console-79544b66db-4bjg6\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.744525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.744421 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79544b66db-4bjg6"
Apr 20 19:23:46.894709 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:46.894675 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79544b66db-4bjg6"]
Apr 20 19:23:46.898472 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:46.898441 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d99161_eec7_4e45_bc17_4ffe78c87e59.slice/crio-728fab4880c5d72281773487d6096246bd63adc1ad2c0b2eb718803480fa1d87 WatchSource:0}: Error finding container 728fab4880c5d72281773487d6096246bd63adc1ad2c0b2eb718803480fa1d87: Status 404 returned error can't find the container with id 728fab4880c5d72281773487d6096246bd63adc1ad2c0b2eb718803480fa1d87
Apr 20 19:23:47.166417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:47.166375 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79544b66db-4bjg6" event={"ID":"d5d99161-eec7-4e45-bc17-4ffe78c87e59","Type":"ContainerStarted","Data":"728fab4880c5d72281773487d6096246bd63adc1ad2c0b2eb718803480fa1d87"}
Apr 20 19:23:51.222977 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.222940 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-785d86dd9d-4zqct"]
Apr 20 19:23:51.246941 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.246910 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785d86dd9d-4zqct"]
Apr 20 19:23:51.247124 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.247052 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.254950 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.254915 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 19:23:51.343148 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-trusted-ca-bundle\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.343148 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343155 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-oauth-serving-cert\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.343421 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343262 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9smp\" (UniqueName: \"kubernetes.io/projected/85334d2a-c60c-4c7e-b77d-2f16011a6af7-kube-api-access-r9smp\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.343421 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-config\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.343421 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343333 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-service-ca\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.343421 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-serving-cert\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.343421 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.343376 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-oauth-config\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444502 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9smp\" (UniqueName: \"kubernetes.io/projected/85334d2a-c60c-4c7e-b77d-2f16011a6af7-kube-api-access-r9smp\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444502 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444509 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-config\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444536 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-service-ca\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-serving-cert\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444584 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-oauth-config\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444664 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-trusted-ca-bundle\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.444767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.444694 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-oauth-serving-cert\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.445377 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.445329 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-service-ca\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.445377 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.445329 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-config\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.445573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.445402 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-oauth-serving-cert\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.445732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.445709 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-trusted-ca-bundle\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct"
Apr 20 19:23:51.447524 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.447499 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-serving-cert\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:23:51.447628 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.447548 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-oauth-config\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:23:51.452698 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.452673 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9smp\" (UniqueName: \"kubernetes.io/projected/85334d2a-c60c-4c7e-b77d-2f16011a6af7-kube-api-access-r9smp\") pod \"console-785d86dd9d-4zqct\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:23:51.559224 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:51.559190 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:23:56.061432 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.061401 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-785d86dd9d-4zqct"] Apr 20 19:23:56.062666 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:23:56.062640 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85334d2a_c60c_4c7e_b77d_2f16011a6af7.slice/crio-18c54f40689c76c75ed10071ad1e478c8487d4f087ff6ff88236625d6a196f2f WatchSource:0}: Error finding container 18c54f40689c76c75ed10071ad1e478c8487d4f087ff6ff88236625d6a196f2f: Status 404 returned error can't find the container with id 18c54f40689c76c75ed10071ad1e478c8487d4f087ff6ff88236625d6a196f2f Apr 20 19:23:56.197372 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.197209 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79544b66db-4bjg6" event={"ID":"d5d99161-eec7-4e45-bc17-4ffe78c87e59","Type":"ContainerStarted","Data":"45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec"} Apr 20 19:23:56.198997 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.198966 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785d86dd9d-4zqct" event={"ID":"85334d2a-c60c-4c7e-b77d-2f16011a6af7","Type":"ContainerStarted","Data":"b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be"} Apr 20 19:23:56.199130 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.199003 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785d86dd9d-4zqct" event={"ID":"85334d2a-c60c-4c7e-b77d-2f16011a6af7","Type":"ContainerStarted","Data":"18c54f40689c76c75ed10071ad1e478c8487d4f087ff6ff88236625d6a196f2f"} Apr 20 19:23:56.200620 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.200595 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-gctks" event={"ID":"dfd03727-181d-4602-92a1-1407031aec92","Type":"ContainerStarted","Data":"4876a8d0c9867cec11ae66db3d8c551eb5581ff3bb9a2a3d0b61c5f41ae745f8"} Apr 20 19:23:56.201211 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.201184 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-gctks" Apr 20 19:23:56.202358 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.202331 2580 patch_prober.go:28] interesting pod/downloads-6bcc868b7-gctks container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" start-of-body= Apr 20 19:23:56.202440 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.202384 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-gctks" podUID="dfd03727-181d-4602-92a1-1407031aec92" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.21:8080/\": dial tcp 10.132.0.21:8080: connect: connection refused" Apr 20 19:23:56.215524 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.215484 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79544b66db-4bjg6" podStartSLOduration=1.181034162 podStartE2EDuration="10.215471641s" podCreationTimestamp="2026-04-20 19:23:46 +0000 UTC" firstStartedPulling="2026-04-20 19:23:46.900719952 +0000 UTC m=+194.056269752" lastFinishedPulling="2026-04-20 19:23:55.935157434 +0000 UTC m=+203.090707231" observedRunningTime="2026-04-20 19:23:56.213618802 +0000 UTC m=+203.369168626" watchObservedRunningTime="2026-04-20 19:23:56.215471641 +0000 UTC m=+203.371021461" Apr 20 19:23:56.229466 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.229424 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-gctks" 
podStartSLOduration=1.002256232 podStartE2EDuration="20.229409877s" podCreationTimestamp="2026-04-20 19:23:36 +0000 UTC" firstStartedPulling="2026-04-20 19:23:36.742781609 +0000 UTC m=+183.898331408" lastFinishedPulling="2026-04-20 19:23:55.969935244 +0000 UTC m=+203.125485053" observedRunningTime="2026-04-20 19:23:56.227381832 +0000 UTC m=+203.382931652" watchObservedRunningTime="2026-04-20 19:23:56.229409877 +0000 UTC m=+203.384959707" Apr 20 19:23:56.243643 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.243602 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-785d86dd9d-4zqct" podStartSLOduration=5.243588939 podStartE2EDuration="5.243588939s" podCreationTimestamp="2026-04-20 19:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:23:56.242374311 +0000 UTC m=+203.397924133" watchObservedRunningTime="2026-04-20 19:23:56.243588939 +0000 UTC m=+203.399138758" Apr 20 19:23:56.745443 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.745404 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79544b66db-4bjg6" Apr 20 19:23:56.745443 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.745445 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-79544b66db-4bjg6" Apr 20 19:23:56.751335 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:56.751305 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79544b66db-4bjg6" Apr 20 19:23:57.208668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:57.208636 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79544b66db-4bjg6" Apr 20 19:23:57.221940 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:23:57.221908 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/downloads-6bcc868b7-gctks" Apr 20 19:24:01.559369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:01.559334 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:24:01.559809 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:01.559382 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:24:01.564857 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:01.564832 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:24:02.226564 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:02.226531 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:24:02.268713 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:02.268676 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79544b66db-4bjg6"] Apr 20 19:24:02.997932 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:02.997903 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/init-config-reloader/0.log" Apr 20 19:24:03.193066 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:03.193027 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/alertmanager/0.log" Apr 20 19:24:03.396111 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:03.396078 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/config-reloader/0.log" Apr 20 19:24:03.593858 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:03.593823 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/kube-rbac-proxy-web/0.log" Apr 20 19:24:03.792374 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:03.792342 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/kube-rbac-proxy/0.log" Apr 20 19:24:03.992664 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:03.992637 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/kube-rbac-proxy-metric/0.log" Apr 20 19:24:04.193318 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:04.193226 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1c58cdd6-e375-4b85-80bc-e01fbad7f866/prom-label-proxy/0.log" Apr 20 19:24:04.592907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:04.592879 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f8lth_ff99f872-5830-475e-b0ab-2019f63a53c2/kube-state-metrics/0.log" Apr 20 19:24:04.793414 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:04.793373 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f8lth_ff99f872-5830-475e-b0ab-2019f63a53c2/kube-rbac-proxy-main/0.log" Apr 20 19:24:04.992711 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:04.992638 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f8lth_ff99f872-5830-475e-b0ab-2019f63a53c2/kube-rbac-proxy-self/0.log" Apr 20 19:24:06.192614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.192578 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqt9s_d41e64ff-5632-4dbf-9594-58ad2cd1ccc5/init-textfile/0.log" Apr 20 19:24:06.377802 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:24:06.377757 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" containerName="registry" containerID="cri-o://2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba" gracePeriod=30 Apr 20 19:24:06.393282 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.393262 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqt9s_d41e64ff-5632-4dbf-9594-58ad2cd1ccc5/node-exporter/0.log" Apr 20 19:24:06.593466 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.593443 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqt9s_d41e64ff-5632-4dbf-9594-58ad2cd1ccc5/kube-rbac-proxy/0.log" Apr 20 19:24:06.643849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.643828 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:24:06.790682 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790607 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlxdl\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-kube-api-access-zlxdl\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.790682 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790656 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-installation-pull-secrets\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.790682 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790673 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-bound-sa-token\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.790987 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790702 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-trusted-ca\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.790987 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790840 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-certificates\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.790987 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790945 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-image-registry-private-configuration\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.790987 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.790985 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-ca-trust-extracted\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.791198 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.791018 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") pod \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\" (UID: \"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b\") " Apr 20 19:24:06.791198 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.791101 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:06.791198 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.791169 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:06.791537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.791486 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-trusted-ca\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.791537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.791513 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-certificates\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.793642 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.793614 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-kube-api-access-zlxdl" (OuterVolumeSpecName: "kube-api-access-zlxdl") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "kube-api-access-zlxdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:24:06.793642 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.793623 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:06.793927 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.793908 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:24:06.793927 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.793914 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:24:06.794067 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.793936 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:06.799580 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.799559 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" (UID: "3d02caf8-5ad7-4d7c-aad3-54babb0bd46b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:24:06.892828 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.892779 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-installation-pull-secrets\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.892828 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.892818 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-bound-sa-token\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.892828 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.892829 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-image-registry-private-configuration\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.892828 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.892840 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-ca-trust-extracted\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.893129 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.892849 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-registry-tls\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:06.893129 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:06.892859 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zlxdl\" (UniqueName: \"kubernetes.io/projected/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b-kube-api-access-zlxdl\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath 
\"\"" Apr 20 19:24:07.239344 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.239267 2580 generic.go:358] "Generic (PLEG): container finished" podID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" containerID="2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba" exitCode=0 Apr 20 19:24:07.239820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.239351 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" Apr 20 19:24:07.239820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.239350 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" event={"ID":"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b","Type":"ContainerDied","Data":"2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba"} Apr 20 19:24:07.239820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.239396 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5755568fd6-9nlq5" event={"ID":"3d02caf8-5ad7-4d7c-aad3-54babb0bd46b","Type":"ContainerDied","Data":"584f0a84a91da385f9285330b635289e369be80d2a895c9219896ba0bbe5cbbc"} Apr 20 19:24:07.239820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.239416 2580 scope.go:117] "RemoveContainer" containerID="2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba" Apr 20 19:24:07.248782 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.248761 2580 scope.go:117] "RemoveContainer" containerID="2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba" Apr 20 19:24:07.249018 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:07.249000 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba\": container with ID starting with 2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba not found: ID does not 
exist" containerID="2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba" Apr 20 19:24:07.249081 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.249026 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba"} err="failed to get container status \"2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba\": rpc error: code = NotFound desc = could not find container \"2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba\": container with ID starting with 2e52c8f857c375712c4c889fc63579460cf1e8873cea04e67e015db29351bcba not found: ID does not exist" Apr 20 19:24:07.259604 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.259581 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5755568fd6-9nlq5"] Apr 20 19:24:07.262609 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.262590 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5755568fd6-9nlq5"] Apr 20 19:24:07.393393 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.393364 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p8t2c_3f387372-bc50-4d3b-b439-725017d71a0f/kube-rbac-proxy-main/0.log" Apr 20 19:24:07.552375 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.552335 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" path="/var/lib/kubelet/pods/3d02caf8-5ad7-4d7c-aad3-54babb0bd46b/volumes" Apr 20 19:24:07.592648 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.592619 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p8t2c_3f387372-bc50-4d3b-b439-725017d71a0f/kube-rbac-proxy-self/0.log" Apr 20 19:24:07.793379 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:07.793341 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p8t2c_3f387372-bc50-4d3b-b439-725017d71a0f/openshift-state-metrics/0.log" Apr 20 19:24:09.793177 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:09.793140 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-fdl24_fdde5ed3-13c4-4768-8526-e3485db975eb/prometheus-operator-admission-webhook/0.log" Apr 20 19:24:09.994497 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:09.994469 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/thanos-query/0.log" Apr 20 19:24:10.192776 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:10.192694 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy-web/0.log" Apr 20 19:24:10.393562 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:10.393524 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy/0.log" Apr 20 19:24:10.592517 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:10.592483 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/prom-label-proxy/0.log" Apr 20 19:24:10.792550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:10.792519 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy-rules/0.log" Apr 20 19:24:10.993506 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:10.993429 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy-metrics/0.log" Apr 20 19:24:11.192449 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:11.192414 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-76ngm_3fa1e505-222b-4d26-b6c6-b500bff9d597/networking-console-plugin/0.log" Apr 20 19:24:11.393293 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:11.393269 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:24:11.595310 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:11.595275 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/3.log" Apr 20 19:24:11.794172 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:11.794141 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-785d86dd9d-4zqct_85334d2a-c60c-4c7e-b77d-2f16011a6af7/console/0.log" Apr 20 19:24:11.993159 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:11.993131 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79544b66db-4bjg6_d5d99161-eec7-4e45-bc17-4ffe78c87e59/console/0.log" Apr 20 19:24:12.195131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:12.195039 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-gctks_dfd03727-181d-4602-92a1-1407031aec92/download-server/0.log" Apr 20 19:24:12.393722 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:12.393663 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76f79b8cc6-4kmk9_b2caeaea-f388-4a16-a139-404c07f66f1e/router/0.log" Apr 20 19:24:12.993016 ip-10-0-134-118 kubenswrapper[2580]: 
I0420 19:24:12.992980 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8q97z_b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2/serve-healthcheck-canary/0.log" Apr 20 19:24:27.294068 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.294015 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-79544b66db-4bjg6" podUID="d5d99161-eec7-4e45-bc17-4ffe78c87e59" containerName="console" containerID="cri-o://45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec" gracePeriod=15 Apr 20 19:24:27.562443 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.562420 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79544b66db-4bjg6_d5d99161-eec7-4e45-bc17-4ffe78c87e59/console/0.log" Apr 20 19:24:27.562557 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.562478 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79544b66db-4bjg6" Apr 20 19:24:27.690210 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690170 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-oauth-config\") pod \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " Apr 20 19:24:27.690417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690267 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-serving-cert\") pod \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " Apr 20 19:24:27.690417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690293 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-oauth-serving-cert\") pod \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " Apr 20 19:24:27.690417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690319 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhjck\" (UniqueName: \"kubernetes.io/projected/d5d99161-eec7-4e45-bc17-4ffe78c87e59-kube-api-access-xhjck\") pod \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " Apr 20 19:24:27.690417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690354 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-config\") pod \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " Apr 20 19:24:27.690417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690382 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-service-ca\") pod \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\" (UID: \"d5d99161-eec7-4e45-bc17-4ffe78c87e59\") " Apr 20 19:24:27.690737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690712 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d5d99161-eec7-4e45-bc17-4ffe78c87e59" (UID: "d5d99161-eec7-4e45-bc17-4ffe78c87e59"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:27.690794 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690731 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-config" (OuterVolumeSpecName: "console-config") pod "d5d99161-eec7-4e45-bc17-4ffe78c87e59" (UID: "d5d99161-eec7-4e45-bc17-4ffe78c87e59"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:27.690900 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.690821 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-service-ca" (OuterVolumeSpecName: "service-ca") pod "d5d99161-eec7-4e45-bc17-4ffe78c87e59" (UID: "d5d99161-eec7-4e45-bc17-4ffe78c87e59"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:27.692718 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.692688 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d5d99161-eec7-4e45-bc17-4ffe78c87e59" (UID: "d5d99161-eec7-4e45-bc17-4ffe78c87e59"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:27.692718 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.692705 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d5d99161-eec7-4e45-bc17-4ffe78c87e59" (UID: "d5d99161-eec7-4e45-bc17-4ffe78c87e59"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:27.692846 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.692712 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d99161-eec7-4e45-bc17-4ffe78c87e59-kube-api-access-xhjck" (OuterVolumeSpecName: "kube-api-access-xhjck") pod "d5d99161-eec7-4e45-bc17-4ffe78c87e59" (UID: "d5d99161-eec7-4e45-bc17-4ffe78c87e59"). InnerVolumeSpecName "kube-api-access-xhjck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:24:27.791786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.791753 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:27.791786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.791782 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-service-ca\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:27.791786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.791792 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-oauth-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:27.791995 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.791802 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d99161-eec7-4e45-bc17-4ffe78c87e59-console-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:27.791995 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.791811 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d5d99161-eec7-4e45-bc17-4ffe78c87e59-oauth-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:27.791995 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:27.791819 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xhjck\" (UniqueName: \"kubernetes.io/projected/d5d99161-eec7-4e45-bc17-4ffe78c87e59-kube-api-access-xhjck\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:28.305468 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.305436 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79544b66db-4bjg6_d5d99161-eec7-4e45-bc17-4ffe78c87e59/console/0.log" Apr 20 19:24:28.305921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.305480 2580 generic.go:358] "Generic (PLEG): container finished" podID="d5d99161-eec7-4e45-bc17-4ffe78c87e59" containerID="45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec" exitCode=2 Apr 20 19:24:28.305921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.305524 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79544b66db-4bjg6" event={"ID":"d5d99161-eec7-4e45-bc17-4ffe78c87e59","Type":"ContainerDied","Data":"45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec"} Apr 20 19:24:28.305921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.305549 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79544b66db-4bjg6" Apr 20 19:24:28.305921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.305572 2580 scope.go:117] "RemoveContainer" containerID="45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec" Apr 20 19:24:28.305921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.305556 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79544b66db-4bjg6" event={"ID":"d5d99161-eec7-4e45-bc17-4ffe78c87e59","Type":"ContainerDied","Data":"728fab4880c5d72281773487d6096246bd63adc1ad2c0b2eb718803480fa1d87"} Apr 20 19:24:28.314065 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.314037 2580 scope.go:117] "RemoveContainer" containerID="45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec" Apr 20 19:24:28.314333 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:28.314314 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec\": container with ID starting with 45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec not found: ID does not exist" containerID="45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec" Apr 20 19:24:28.314402 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.314341 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec"} err="failed to get container status \"45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec\": rpc error: code = NotFound desc = could not find container \"45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec\": container with ID starting with 45317b2aa9b362ddae9a3eeb1802439dce1b2624da62d77d1dab84265dcf4cec not found: ID does not exist" Apr 20 19:24:28.329134 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.329110 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79544b66db-4bjg6"] Apr 20 19:24:28.333589 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:28.333568 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79544b66db-4bjg6"] Apr 20 19:24:29.552753 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:29.552710 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d99161-eec7-4e45-bc17-4ffe78c87e59" path="/var/lib/kubelet/pods/d5d99161-eec7-4e45-bc17-4ffe78c87e59/volumes" Apr 20 19:24:45.435982 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:45.435931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:24:45.438569 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:45.438543 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ada6b3-5038-4d1c-bbe5-a9626c8c1987-metrics-certs\") pod \"network-metrics-daemon-mw5qh\" (UID: \"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987\") " pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:24:45.551356 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:45.551330 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gxr6p\"" Apr 20 19:24:45.559274 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:45.559245 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mw5qh" Apr 20 19:24:45.686532 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:45.686458 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mw5qh"] Apr 20 19:24:45.690153 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:24:45.690115 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ada6b3_5038_4d1c_bbe5_a9626c8c1987.slice/crio-fd4a70150493ac2bed16a0b6d04af4b5e74ef03ed6ee8e1709108737f10c120a WatchSource:0}: Error finding container fd4a70150493ac2bed16a0b6d04af4b5e74ef03ed6ee8e1709108737f10c120a: Status 404 returned error can't find the container with id fd4a70150493ac2bed16a0b6d04af4b5e74ef03ed6ee8e1709108737f10c120a Apr 20 19:24:46.366615 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:46.366571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mw5qh" event={"ID":"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987","Type":"ContainerStarted","Data":"fd4a70150493ac2bed16a0b6d04af4b5e74ef03ed6ee8e1709108737f10c120a"} Apr 20 19:24:47.371728 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.371695 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mw5qh" event={"ID":"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987","Type":"ContainerStarted","Data":"1ec0c97792810f7db47e80073c1fce1a1c42c454961f8bf4e571def8a5d52abe"} Apr 20 19:24:47.371728 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.371729 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mw5qh" event={"ID":"a8ada6b3-5038-4d1c-bbe5-a9626c8c1987","Type":"ContainerStarted","Data":"46aa51a760f9d8749dbcaa48d9bae53ada2695fa9a960bcfb1307ef447730c83"} Apr 20 19:24:47.388554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.388498 2580 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-mw5qh" podStartSLOduration=253.190769506 podStartE2EDuration="4m14.388454101s" podCreationTimestamp="2026-04-20 19:20:33 +0000 UTC" firstStartedPulling="2026-04-20 19:24:45.692039031 +0000 UTC m=+252.847588832" lastFinishedPulling="2026-04-20 19:24:46.889723621 +0000 UTC m=+254.045273427" observedRunningTime="2026-04-20 19:24:47.387463456 +0000 UTC m=+254.543013290" watchObservedRunningTime="2026-04-20 19:24:47.388454101 +0000 UTC m=+254.544003920" Apr 20 19:24:47.416535 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.416500 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:24:47.417115 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.417059 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="alertmanager" containerID="cri-o://112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408" gracePeriod=120 Apr 20 19:24:47.417586 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.417555 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-web" containerID="cri-o://b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9" gracePeriod=120 Apr 20 19:24:47.417586 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.417547 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy" containerID="cri-o://93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d" gracePeriod=120 Apr 20 19:24:47.417780 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.417651 2580 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="openshift-monitoring/alertmanager-main-0" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="prom-label-proxy" containerID="cri-o://2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37" gracePeriod=120 Apr 20 19:24:47.417780 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.417713 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-metric" containerID="cri-o://e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d" gracePeriod=120 Apr 20 19:24:47.417890 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:47.417703 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="config-reloader" containerID="cri-o://22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165" gracePeriod=120 Apr 20 19:24:48.378531 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378499 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37" exitCode=0 Apr 20 19:24:48.378531 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378525 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d" exitCode=0 Apr 20 19:24:48.378531 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378532 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165" exitCode=0 Apr 20 19:24:48.378531 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378538 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408" exitCode=0 Apr 20 19:24:48.379007 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378576 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"} Apr 20 19:24:48.379007 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378610 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"} Apr 20 19:24:48.379007 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378621 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"} Apr 20 19:24:48.379007 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.378630 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"} Apr 20 19:24:48.687776 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.687752 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:48.763311 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763240 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-metrics-client-ca\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763488 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763321 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-web\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763488 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763360 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-web-config\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763488 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763378 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-tls-assets\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763488 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763418 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-main-tls\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 
20 19:24:48.763488 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763449 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-cluster-tls-config\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763505 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763534 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763561 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-trusted-ca-bundle\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763594 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-main-db\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") 
" Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763607 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763621 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-out\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763695 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-volume\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.763746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.763734 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-782jb\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-kube-api-access-782jb\") pod \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\" (UID: \"1c58cdd6-e375-4b85-80bc-e01fbad7f866\") " Apr 20 19:24:48.764159 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.764049 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-metrics-client-ca\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:24:48.765070 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.764780 2580 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:24:48.765070 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.765038 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:24:48.766889 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.766852 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-out" (OuterVolumeSpecName: "config-out") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:24:48.767263 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.767211 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:24:48.767554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.767386 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:48.767554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.767523 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:24:48.767681 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.767585 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:48.767736 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.767703 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-kube-api-access-782jb" (OuterVolumeSpecName: "kube-api-access-782jb") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "kube-api-access-782jb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:24:48.767956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.767939 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:48.768192 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.768167 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:48.771487 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.771455 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:48.779175 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.779148 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-web-config" (OuterVolumeSpecName: "web-config") pod "1c58cdd6-e375-4b85-80bc-e01fbad7f866" (UID: "1c58cdd6-e375-4b85-80bc-e01fbad7f866"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:24:48.865361 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865324 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-web-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865361 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865356 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-tls-assets\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865361 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865369 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-main-tls\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865383 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-cluster-tls-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865398 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865412 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865425 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865437 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-alertmanager-main-db\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865448 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-out\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865459 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-config-volume\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865470 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-782jb\" (UniqueName: \"kubernetes.io/projected/1c58cdd6-e375-4b85-80bc-e01fbad7f866-kube-api-access-782jb\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:48.865614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:48.865481 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1c58cdd6-e375-4b85-80bc-e01fbad7f866-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:24:49.384291 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384238 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d" exitCode=0
Apr 20 19:24:49.384291 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384289 2580 generic.go:358] "Generic (PLEG): container finished" podID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerID="b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9" exitCode=0
Apr 20 19:24:49.384798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384329 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"}
Apr 20 19:24:49.384798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384366 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.384798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"}
Apr 20 19:24:49.384798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384394 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1c58cdd6-e375-4b85-80bc-e01fbad7f866","Type":"ContainerDied","Data":"f7c7f9894bd805168d27adcd514e76007dcb193e1282b742a37445dc8f6c749c"}
Apr 20 19:24:49.384798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.384414 2580 scope.go:117] "RemoveContainer" containerID="2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"
Apr 20 19:24:49.392428 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.392290 2580 scope.go:117] "RemoveContainer" containerID="e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"
Apr 20 19:24:49.399125 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.399107 2580 scope.go:117] "RemoveContainer" containerID="93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"
Apr 20 19:24:49.405855 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.405835 2580 scope.go:117] "RemoveContainer" containerID="b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"
Apr 20 19:24:49.407939 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.407917 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:24:49.411735 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.411715 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:24:49.413914 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.413896 2580 scope.go:117] "RemoveContainer" containerID="22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"
Apr 20 19:24:49.420440 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.420422 2580 scope.go:117] "RemoveContainer" containerID="112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"
Apr 20 19:24:49.426958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.426942 2580 scope.go:117] "RemoveContainer" containerID="c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"
Apr 20 19:24:49.435087 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435066 2580 scope.go:117] "RemoveContainer" containerID="2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"
Apr 20 19:24:49.435366 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.435347 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37\": container with ID starting with 2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37 not found: ID does not exist" containerID="2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"
Apr 20 19:24:49.435423 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435375 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"} err="failed to get container status \"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37\": rpc error: code = NotFound desc = could not find container \"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37\": container with ID starting with 2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37 not found: ID does not exist"
Apr 20 19:24:49.435423 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435393 2580 scope.go:117] "RemoveContainer" containerID="e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"
Apr 20 19:24:49.435603 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.435589 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d\": container with ID starting with e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d not found: ID does not exist" containerID="e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"
Apr 20 19:24:49.435642 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435608 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"} err="failed to get container status \"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d\": rpc error: code = NotFound desc = could not find container \"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d\": container with ID starting with e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d not found: ID does not exist"
Apr 20 19:24:49.435642 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435621 2580 scope.go:117] "RemoveContainer" containerID="93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"
Apr 20 19:24:49.435821 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.435803 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d\": container with ID starting with 93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d not found: ID does not exist" containerID="93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"
Apr 20 19:24:49.435859 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435830 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"} err="failed to get container status \"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d\": rpc error: code = NotFound desc = could not find container \"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d\": container with ID starting with 93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d not found: ID does not exist"
Apr 20 19:24:49.435859 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.435847 2580 scope.go:117] "RemoveContainer" containerID="b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"
Apr 20 19:24:49.436090 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.436074 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9\": container with ID starting with b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9 not found: ID does not exist" containerID="b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"
Apr 20 19:24:49.436138 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436092 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"} err="failed to get container status \"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9\": rpc error: code = NotFound desc = could not find container \"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9\": container with ID starting with b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9 not found: ID does not exist"
Apr 20 19:24:49.436138 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436105 2580 scope.go:117] "RemoveContainer" containerID="22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"
Apr 20 19:24:49.436335 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.436320 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165\": container with ID starting with 22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165 not found: ID does not exist" containerID="22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"
Apr 20 19:24:49.436390 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436338 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"} err="failed to get container status \"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165\": rpc error: code = NotFound desc = could not find container \"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165\": container with ID starting with 22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165 not found: ID does not exist"
Apr 20 19:24:49.436390 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436351 2580 scope.go:117] "RemoveContainer" containerID="112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"
Apr 20 19:24:49.436533 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.436517 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408\": container with ID starting with 112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408 not found: ID does not exist" containerID="112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"
Apr 20 19:24:49.436576 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436537 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"} err="failed to get container status \"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408\": rpc error: code = NotFound desc = could not find container \"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408\": container with ID starting with 112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408 not found: ID does not exist"
Apr 20 19:24:49.436576 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436550 2580 scope.go:117] "RemoveContainer" containerID="c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"
Apr 20 19:24:49.436756 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:24:49.436742 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2\": container with ID starting with c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2 not found: ID does not exist" containerID="c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"
Apr 20 19:24:49.436796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436760 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"} err="failed to get container status \"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2\": rpc error: code = NotFound desc = could not find container \"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2\": container with ID starting with c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2 not found: ID does not exist"
Apr 20 19:24:49.436796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436773 2580 scope.go:117] "RemoveContainer" containerID="2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"
Apr 20 19:24:49.436966 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436948 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37"} err="failed to get container status \"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37\": rpc error: code = NotFound desc = could not find container \"2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37\": container with ID starting with 2247250181b353e15259262ccc83c72d188f57b6b24a227d4121d99311f19f37 not found: ID does not exist"
Apr 20 19:24:49.437005 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.436968 2580 scope.go:117] "RemoveContainer" containerID="e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"
Apr 20 19:24:49.437191 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437175 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d"} err="failed to get container status \"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d\": rpc error: code = NotFound desc = could not find container \"e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d\": container with ID starting with e414aa74e0f5f8544f13f8762badd607d84369b127ba7cc133e91a6bc7f4315d not found: ID does not exist"
Apr 20 19:24:49.437267 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437192 2580 scope.go:117] "RemoveContainer" containerID="93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"
Apr 20 19:24:49.437369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437354 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d"} err="failed to get container status \"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d\": rpc error: code = NotFound desc = could not find container \"93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d\": container with ID starting with 93b63883f192cb3a50a718cd3ce0f661ac3d00ab9e9ba9159c33983fc4c00d0d not found: ID does not exist"
Apr 20 19:24:49.437429 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437370 2580 scope.go:117] "RemoveContainer" containerID="b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"
Apr 20 19:24:49.437578 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437564 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9"} err="failed to get container status \"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9\": rpc error: code = NotFound desc = could not find container \"b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9\": container with ID starting with b451684d116191c679b075cc185dc869c2fddaa426160258121b3e0b761077e9 not found: ID does not exist"
Apr 20 19:24:49.437620 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437578 2580 scope.go:117] "RemoveContainer" containerID="22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"
Apr 20 19:24:49.437746 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437732 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165"} err="failed to get container status \"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165\": rpc error: code = NotFound desc = could not find container \"22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165\": container with ID starting with 22f1089118fb9e66eca274fb5dc201ea42ded96974b37a2e578fb830fa8e9165 not found: ID does not exist"
Apr 20 19:24:49.437795 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437746 2580 scope.go:117] "RemoveContainer" containerID="112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"
Apr 20 19:24:49.437931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437917 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408"} err="failed to get container status \"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408\": rpc error: code = NotFound desc = could not find container \"112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408\": container with ID starting with 112ffe5a5f660cc4900c80d252d76de34944f2edd7c1bf3a3c828beb20a72408 not found: ID does not exist"
Apr 20 19:24:49.437980 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.437933 2580 scope.go:117] "RemoveContainer" containerID="c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"
Apr 20 19:24:49.438144 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.438127 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2"} err="failed to get container status \"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2\": rpc error: code = NotFound desc = could not find container \"c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2\": container with ID starting with c9e87c014cc54bbf41918628421afb463a7c58dcae33662d6dd9e083e244b3e2 not found: ID does not exist"
Apr 20 19:24:49.440818 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.440799 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:24:49.441109 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441096 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="init-config-reloader"
Apr 20 19:24:49.441155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441112 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="init-config-reloader"
Apr 20 19:24:49.441155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441121 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="alertmanager"
Apr 20 19:24:49.441155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441127 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="alertmanager"
Apr 20 19:24:49.441155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441141 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy"
Apr 20 19:24:49.441155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441148 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441159 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="config-reloader"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441164 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="config-reloader"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441174 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="prom-label-proxy"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441179 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="prom-label-proxy"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441186 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" containerName="registry"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441190 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" containerName="registry"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441196 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-web"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441201 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-web"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441214 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5d99161-eec7-4e45-bc17-4ffe78c87e59" containerName="console"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441221 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d99161-eec7-4e45-bc17-4ffe78c87e59" containerName="console"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441228 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-metric"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441235 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-metric"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441300 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5d99161-eec7-4e45-bc17-4ffe78c87e59" containerName="console"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441309 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="prom-label-proxy"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441317 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="config-reloader"
Apr 20 19:24:49.441327 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441327 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-web"
Apr 20 19:24:49.441821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441337 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy-metric"
Apr 20 19:24:49.441821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441344 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="kube-rbac-proxy"
Apr 20 19:24:49.441821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441350 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d02caf8-5ad7-4d7c-aad3-54babb0bd46b" containerName="registry"
Apr 20 19:24:49.441821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.441356 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" containerName="alertmanager"
Apr 20 19:24:49.446195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.446179 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.448365 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448343 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 19:24:49.448451 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448381 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 19:24:49.448523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448502 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-b976t\""
Apr 20 19:24:49.448635 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448584 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 19:24:49.448711 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448653 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 19:24:49.448816 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448798 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 19:24:49.448894 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.448826 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 19:24:49.449047 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.449030 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 19:24:49.449114 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.449060 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 19:24:49.453155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.453128 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 19:24:49.459604 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.459584 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:24:49.551887 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.551856 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c58cdd6-e375-4b85-80bc-e01fbad7f866" path="/var/lib/kubelet/pods/1c58cdd6-e375-4b85-80bc-e01fbad7f866/volumes"
Apr 20 19:24:49.571513 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571490 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571521 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fed37ca-f706-4cae-9747-83d43a58e7a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571676 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571592 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fed37ca-f706-4cae-9747-83d43a58e7a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571676 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571646 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571676 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571666 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571780 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571688 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fed37ca-f706-4cae-9747-83d43a58e7a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571780 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571713 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-web-config\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571780 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d9j5\" (UniqueName: \"kubernetes.io/projected/6fed37ca-f706-4cae-9747-83d43a58e7a7-kube-api-access-6d9j5\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571780 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fed37ca-f706-4cae-9747-83d43a58e7a7-config-out\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571802 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571824 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6fed37ca-f706-4cae-9747-83d43a58e7a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.571877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.571847 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.672832 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-web-config\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.672832 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672798 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d9j5\" (UniqueName: \"kubernetes.io/projected/6fed37ca-f706-4cae-9747-83d43a58e7a7-kube-api-access-6d9j5\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.672832 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fed37ca-f706-4cae-9747-83d43a58e7a7-config-out\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672851 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName:
\"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672874 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6fed37ca-f706-4cae-9747-83d43a58e7a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.672986 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fed37ca-f706-4cae-9747-83d43a58e7a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:24:49.673021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673071 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.673052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fed37ca-f706-4cae-9747-83d43a58e7a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.673085 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-config-volume\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.673116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.673183 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fed37ca-f706-4cae-9747-83d43a58e7a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.673668 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.673411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/6fed37ca-f706-4cae-9747-83d43a58e7a7-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.674065 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.674037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fed37ca-f706-4cae-9747-83d43a58e7a7-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.674167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.674143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fed37ca-f706-4cae-9747-83d43a58e7a7-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.675943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.675912 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6fed37ca-f706-4cae-9747-83d43a58e7a7-config-out\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.676129 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.676104 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" 
(UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.676194 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.676179 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6fed37ca-f706-4cae-9747-83d43a58e7a7-tls-assets\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.676660 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.676639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.676791 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.676772 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-web-config\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.677058 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.677037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.677105 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.677083 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-config-volume\") pod \"alertmanager-main-0\" 
(UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.677139 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.677088 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.678197 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.678174 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/6fed37ca-f706-4cae-9747-83d43a58e7a7-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.680029 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.680008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d9j5\" (UniqueName: \"kubernetes.io/projected/6fed37ca-f706-4cae-9747-83d43a58e7a7-kube-api-access-6d9j5\") pod \"alertmanager-main-0\" (UID: \"6fed37ca-f706-4cae-9747-83d43a58e7a7\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.755671 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.755631 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:24:49.885404 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:49.885369 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:24:49.889756 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:24:49.889729 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fed37ca_f706_4cae_9747_83d43a58e7a7.slice/crio-ffbf90c79fd7b3dc21b8d6d7ee66e2ca10e47b5a8b73508700186937d6f58ee7 WatchSource:0}: Error finding container ffbf90c79fd7b3dc21b8d6d7ee66e2ca10e47b5a8b73508700186937d6f58ee7: Status 404 returned error can't find the container with id ffbf90c79fd7b3dc21b8d6d7ee66e2ca10e47b5a8b73508700186937d6f58ee7 Apr 20 19:24:50.388439 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:50.388408 2580 generic.go:358] "Generic (PLEG): container finished" podID="6fed37ca-f706-4cae-9747-83d43a58e7a7" containerID="fd6d443f5985e2c399e1b9194c558bb899a2d2aeb35460733e1a878195ec29c4" exitCode=0 Apr 20 19:24:50.388843 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:50.388494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerDied","Data":"fd6d443f5985e2c399e1b9194c558bb899a2d2aeb35460733e1a878195ec29c4"} Apr 20 19:24:50.388843 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:50.388526 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"ffbf90c79fd7b3dc21b8d6d7ee66e2ca10e47b5a8b73508700186937d6f58ee7"} Apr 20 19:24:51.395748 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.395704 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"f4d782592bf09615c26cd2856af475f125c647a89ad48469fd39c78a168b9edd"} Apr 20 19:24:51.395748 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.395741 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"93fe7851abf14e9eb76bf3f59711f7a6d1a4a8d213325a63cb7e970e8b0cfc51"} Apr 20 19:24:51.395748 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.395750 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"f0c795be977595284cfeabad8439fd7276d5389bc8899e09d79d598e3708134c"} Apr 20 19:24:51.396187 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.395760 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"c536adb0c76d216cb6bf33c9563c984923548f49da2636f450386ad58cc3edb7"} Apr 20 19:24:51.396187 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.395769 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"34f68acd1e0bbafb221d5319bc3e91fcbeb65fd9e0e0e673fbe13fcaf2df2a4c"} Apr 20 19:24:51.396187 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.395776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"6fed37ca-f706-4cae-9747-83d43a58e7a7","Type":"ContainerStarted","Data":"4620baecf241eae75206b0d8461798fbad80b0b7834bcf99524016477bb9cf2b"} Apr 20 19:24:51.422612 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.422557 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.42253987 podStartE2EDuration="2.42253987s" podCreationTimestamp="2026-04-20 19:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:24:51.42062865 +0000 UTC m=+258.576178454" watchObservedRunningTime="2026-04-20 19:24:51.42253987 +0000 UTC m=+258.578089689" Apr 20 19:24:51.452241 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.452195 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-79df64d499-d24hs"] Apr 20 19:24:51.455971 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.455946 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.458502 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.458480 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 19:24:51.458625 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.458525 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-xh8xv\"" Apr 20 19:24:51.458625 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.458578 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 19:24:51.458907 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.458875 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 19:24:51.459026 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.458938 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 19:24:51.459026 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:24:51.458998 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 19:24:51.463736 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.463693 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 19:24:51.471514 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.471487 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79df64d499-d24hs"] Apr 20 19:24:51.593658 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-telemeter-client-tls\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593658 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-federate-client-tls\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593682 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kft5v\" (UniqueName: \"kubernetes.io/projected/4adea75c-b9b0-4da9-a5b9-1458233cf095-kube-api-access-kft5v\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593873 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593748 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-metrics-client-ca\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593766 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-serving-certs-ca-bundle\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593790 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593828 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-secret-telemeter-client\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.593873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.593865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695171 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-serving-certs-ca-bundle\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695171 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695358 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695286 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-secret-telemeter-client\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695358 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695337 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695432 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695381 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-telemeter-client-tls\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695432 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-federate-client-tls\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695535 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695443 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kft5v\" (UniqueName: \"kubernetes.io/projected/4adea75c-b9b0-4da9-a5b9-1458233cf095-kube-api-access-kft5v\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695535 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-metrics-client-ca\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " 
pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.695915 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.695890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-serving-certs-ca-bundle\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.696145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.696077 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-telemeter-trusted-ca-bundle\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.696279 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.696216 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4adea75c-b9b0-4da9-a5b9-1458233cf095-metrics-client-ca\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.698070 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.698049 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-federate-client-tls\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.698145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.698121 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-telemeter-client-tls\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.698426 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.698409 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.698625 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.698603 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4adea75c-b9b0-4da9-a5b9-1458233cf095-secret-telemeter-client\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.702343 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.702327 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kft5v\" (UniqueName: \"kubernetes.io/projected/4adea75c-b9b0-4da9-a5b9-1458233cf095-kube-api-access-kft5v\") pod \"telemeter-client-79df64d499-d24hs\" (UID: \"4adea75c-b9b0-4da9-a5b9-1458233cf095\") " pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.766730 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.766697 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" Apr 20 19:24:51.891843 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:51.891815 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-79df64d499-d24hs"] Apr 20 19:24:51.894677 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:24:51.894649 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4adea75c_b9b0_4da9_a5b9_1458233cf095.slice/crio-da2f2a14a2d841d416510dad79792039b471d1ddf2c82d07ef054df4f6b6b3ce WatchSource:0}: Error finding container da2f2a14a2d841d416510dad79792039b471d1ddf2c82d07ef054df4f6b6b3ce: Status 404 returned error can't find the container with id da2f2a14a2d841d416510dad79792039b471d1ddf2c82d07ef054df4f6b6b3ce Apr 20 19:24:52.400067 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:52.400022 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" event={"ID":"4adea75c-b9b0-4da9-a5b9-1458233cf095","Type":"ContainerStarted","Data":"da2f2a14a2d841d416510dad79792039b471d1ddf2c82d07ef054df4f6b6b3ce"} Apr 20 19:24:54.409546 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:54.409512 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" event={"ID":"4adea75c-b9b0-4da9-a5b9-1458233cf095","Type":"ContainerStarted","Data":"ee65a259f71bf7d740676e3d3a11a271aeb563f4db4f13165cc85318a1726f61"} Apr 20 19:24:54.409850 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:54.409556 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" event={"ID":"4adea75c-b9b0-4da9-a5b9-1458233cf095","Type":"ContainerStarted","Data":"cda5a1f7a90e7cfdf65e5c2a4034e8adf221c20c20dbfe381039856acb8cbdf5"} Apr 20 19:24:54.409850 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:54.409571 2580 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" event={"ID":"4adea75c-b9b0-4da9-a5b9-1458233cf095","Type":"ContainerStarted","Data":"1a4554b6ef143a9bf1b62123014410a017fbc40ec34c5beac6a4c0eb0446aed7"} Apr 20 19:24:54.431563 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:54.431509 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-79df64d499-d24hs" podStartSLOduration=1.143525265 podStartE2EDuration="3.431493112s" podCreationTimestamp="2026-04-20 19:24:51 +0000 UTC" firstStartedPulling="2026-04-20 19:24:51.897111056 +0000 UTC m=+259.052660857" lastFinishedPulling="2026-04-20 19:24:54.185078901 +0000 UTC m=+261.340628704" observedRunningTime="2026-04-20 19:24:54.429824259 +0000 UTC m=+261.585374101" watchObservedRunningTime="2026-04-20 19:24:54.431493112 +0000 UTC m=+261.587042941" Apr 20 19:24:55.054455 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.054423 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-658f8f576-4fx2t"] Apr 20 19:24:55.058788 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.058764 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.067427 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.067404 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-658f8f576-4fx2t"] Apr 20 19:24:55.221913 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.221874 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-trusted-ca-bundle\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.222094 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.221920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-serving-cert\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.222094 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.221968 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-oauth-config\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.222094 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.222067 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-service-ca\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.222232 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.222098 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-oauth-serving-cert\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.222232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.222119 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwhsq\" (UniqueName: \"kubernetes.io/projected/0a8828db-dd56-45ea-a82e-bc0514330eff-kube-api-access-qwhsq\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.222232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.222155 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-console-config\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.322779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-trusted-ca-bundle\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.322779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-serving-cert\") pod 
\"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.322779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322742 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-oauth-config\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.322779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322779 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-service-ca\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.323076 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-oauth-serving-cert\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.323076 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322833 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwhsq\" (UniqueName: \"kubernetes.io/projected/0a8828db-dd56-45ea-a82e-bc0514330eff-kube-api-access-qwhsq\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.323076 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.322869 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-console-config\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.323822 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.323782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-service-ca\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.323979 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.323956 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-console-config\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.324290 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.324240 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-trusted-ca-bundle\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.324366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.324244 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-oauth-serving-cert\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.329562 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.326384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-serving-cert\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.329562 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.326648 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-oauth-config\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.332222 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.332199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwhsq\" (UniqueName: \"kubernetes.io/projected/0a8828db-dd56-45ea-a82e-bc0514330eff-kube-api-access-qwhsq\") pod \"console-658f8f576-4fx2t\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.369665 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.369640 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:24:55.493226 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:55.493198 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-658f8f576-4fx2t"] Apr 20 19:24:55.496211 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:24:55.496182 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a8828db_dd56_45ea_a82e_bc0514330eff.slice/crio-59b5ac4eee631fb70280e4edbbffbbe73dbb957aaf421a9cc1d85518cda7b200 WatchSource:0}: Error finding container 59b5ac4eee631fb70280e4edbbffbbe73dbb957aaf421a9cc1d85518cda7b200: Status 404 returned error can't find the container with id 59b5ac4eee631fb70280e4edbbffbbe73dbb957aaf421a9cc1d85518cda7b200 Apr 20 19:24:56.417666 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:56.417629 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658f8f576-4fx2t" event={"ID":"0a8828db-dd56-45ea-a82e-bc0514330eff","Type":"ContainerStarted","Data":"b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871"} Apr 20 19:24:56.417666 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:56.417667 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658f8f576-4fx2t" event={"ID":"0a8828db-dd56-45ea-a82e-bc0514330eff","Type":"ContainerStarted","Data":"59b5ac4eee631fb70280e4edbbffbbe73dbb957aaf421a9cc1d85518cda7b200"} Apr 20 19:24:56.436175 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:24:56.436116 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-658f8f576-4fx2t" podStartSLOduration=1.436098363 podStartE2EDuration="1.436098363s" podCreationTimestamp="2026-04-20 19:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:24:56.435119417 +0000 UTC m=+263.590669236" 
watchObservedRunningTime="2026-04-20 19:24:56.436098363 +0000 UTC m=+263.591648178" Apr 20 19:25:05.370674 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:05.370627 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:25:05.370674 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:05.370680 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:25:05.375545 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:05.375518 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:25:05.454808 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:05.454781 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:25:05.506186 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:05.506152 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-785d86dd9d-4zqct"] Apr 20 19:25:30.531211 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.531150 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-785d86dd9d-4zqct" podUID="85334d2a-c60c-4c7e-b77d-2f16011a6af7" containerName="console" containerID="cri-o://b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be" gracePeriod=15 Apr 20 19:25:30.775000 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.774978 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-785d86dd9d-4zqct_85334d2a-c60c-4c7e-b77d-2f16011a6af7/console/0.log" Apr 20 19:25:30.775118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.775046 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:25:30.797553 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797471 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-trusted-ca-bundle\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.797553 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797528 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9smp\" (UniqueName: \"kubernetes.io/projected/85334d2a-c60c-4c7e-b77d-2f16011a6af7-kube-api-access-r9smp\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.797748 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797557 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-service-ca\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.797748 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797673 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-serving-cert\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.797748 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797715 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-oauth-serving-cert\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.797903 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797759 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-oauth-config\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.797903 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.797795 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-config\") pod \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\" (UID: \"85334d2a-c60c-4c7e-b77d-2f16011a6af7\") " Apr 20 19:25:30.798074 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.798046 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-service-ca" (OuterVolumeSpecName: "service-ca") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:25:30.798165 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.798090 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:25:30.798353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.798329 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-config" (OuterVolumeSpecName: "console-config") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:25:30.798640 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.798616 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:25:30.799950 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.799924 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85334d2a-c60c-4c7e-b77d-2f16011a6af7-kube-api-access-r9smp" (OuterVolumeSpecName: "kube-api-access-r9smp") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "kube-api-access-r9smp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:25:30.800179 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.800156 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:25:30.800420 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.800384 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "85334d2a-c60c-4c7e-b77d-2f16011a6af7" (UID: "85334d2a-c60c-4c7e-b77d-2f16011a6af7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:25:30.898931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898891 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-trusted-ca-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:30.898931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898923 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r9smp\" (UniqueName: \"kubernetes.io/projected/85334d2a-c60c-4c7e-b77d-2f16011a6af7-kube-api-access-r9smp\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:30.898931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898934 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-service-ca\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:30.898931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898943 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:30.899195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898953 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-oauth-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:30.899195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898963 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-oauth-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:30.899195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:30.898973 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85334d2a-c60c-4c7e-b77d-2f16011a6af7-console-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:25:31.529103 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.529070 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-785d86dd9d-4zqct_85334d2a-c60c-4c7e-b77d-2f16011a6af7/console/0.log" Apr 20 19:25:31.529273 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.529116 2580 generic.go:358] "Generic (PLEG): container finished" podID="85334d2a-c60c-4c7e-b77d-2f16011a6af7" containerID="b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be" exitCode=2 Apr 20 19:25:31.529273 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.529147 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785d86dd9d-4zqct" event={"ID":"85334d2a-c60c-4c7e-b77d-2f16011a6af7","Type":"ContainerDied","Data":"b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be"} Apr 20 19:25:31.529273 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.529186 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-785d86dd9d-4zqct" event={"ID":"85334d2a-c60c-4c7e-b77d-2f16011a6af7","Type":"ContainerDied","Data":"18c54f40689c76c75ed10071ad1e478c8487d4f087ff6ff88236625d6a196f2f"} Apr 20 19:25:31.529273 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.529193 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-785d86dd9d-4zqct" Apr 20 19:25:31.529273 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.529202 2580 scope.go:117] "RemoveContainer" containerID="b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be" Apr 20 19:25:31.537663 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.537480 2580 scope.go:117] "RemoveContainer" containerID="b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be" Apr 20 19:25:31.537906 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:25:31.537759 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be\": container with ID starting with b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be not found: ID does not exist" containerID="b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be" Apr 20 19:25:31.537906 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.537795 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be"} err="failed to get container status \"b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be\": rpc error: code = NotFound desc = could not find container \"b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be\": container with ID starting with b0187dedfb5af5bd5d4487cd6caeccf470988257571490f8725cf739fc7616be not found: ID does not exist" Apr 20 19:25:31.552763 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.552740 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-785d86dd9d-4zqct"] Apr 20 19:25:31.555764 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:31.555742 2580 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-785d86dd9d-4zqct"] Apr 20 19:25:33.460511 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:33.460480 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:25:33.462810 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:33.462789 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:25:33.467672 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:33.467650 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:25:33.470291 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:33.470272 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:25:33.470732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:33.470715 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:25:33.552934 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:25:33.552810 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85334d2a-c60c-4c7e-b77d-2f16011a6af7" path="/var/lib/kubelet/pods/85334d2a-c60c-4c7e-b77d-2f16011a6af7/volumes" Apr 20 19:26:11.050975 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.050931 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6689b8cd74-x72mb"] Apr 20 19:26:11.051533 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.051492 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85334d2a-c60c-4c7e-b77d-2f16011a6af7" containerName="console" Apr 20 19:26:11.051533 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:26:11.051514 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="85334d2a-c60c-4c7e-b77d-2f16011a6af7" containerName="console" Apr 20 19:26:11.051671 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.051595 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="85334d2a-c60c-4c7e-b77d-2f16011a6af7" containerName="console" Apr 20 19:26:11.055400 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.055372 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.063786 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.063499 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6689b8cd74-x72mb"] Apr 20 19:26:11.237013 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.236971 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-service-ca\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.237188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.237026 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-oauth-serving-cert\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.237188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.237046 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-serving-cert\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " 
pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.237188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.237062 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-oauth-config\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.237188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.237081 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fql\" (UniqueName: \"kubernetes.io/projected/d6aa5975-60c4-418b-baac-3401036ab231-kube-api-access-c6fql\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.237188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.237098 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-console-config\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.237188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.237136 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-trusted-ca-bundle\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-oauth-serving-cert\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338492 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-serving-cert\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338510 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-oauth-config\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fql\" (UniqueName: \"kubernetes.io/projected/d6aa5975-60c4-418b-baac-3401036ab231-kube-api-access-c6fql\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338880 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338789 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-console-config\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338880 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338849 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-trusted-ca-bundle\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.338983 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.338931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-service-ca\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.339216 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.339189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-oauth-serving-cert\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.339504 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.339463 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-console-config\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.339674 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.339651 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-trusted-ca-bundle\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.339898 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.339872 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-service-ca\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.341296 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.341271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-oauth-config\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.341385 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.341318 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-serving-cert\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.346120 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.346099 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6fql\" (UniqueName: \"kubernetes.io/projected/d6aa5975-60c4-418b-baac-3401036ab231-kube-api-access-c6fql\") pod \"console-6689b8cd74-x72mb\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.367174 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.367151 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:11.494719 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.494687 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6689b8cd74-x72mb"] Apr 20 19:26:11.498077 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:26:11.498046 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6aa5975_60c4_418b_baac_3401036ab231.slice/crio-abd7442d8acf1cc684abbe013f18785951660f2841e9ce5d97218a9a11c09a34 WatchSource:0}: Error finding container abd7442d8acf1cc684abbe013f18785951660f2841e9ce5d97218a9a11c09a34: Status 404 returned error can't find the container with id abd7442d8acf1cc684abbe013f18785951660f2841e9ce5d97218a9a11c09a34 Apr 20 19:26:11.500303 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.500286 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:26:11.647204 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.647121 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6689b8cd74-x72mb" event={"ID":"d6aa5975-60c4-418b-baac-3401036ab231","Type":"ContainerStarted","Data":"62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d"} Apr 20 19:26:11.647204 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.647156 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6689b8cd74-x72mb" event={"ID":"d6aa5975-60c4-418b-baac-3401036ab231","Type":"ContainerStarted","Data":"abd7442d8acf1cc684abbe013f18785951660f2841e9ce5d97218a9a11c09a34"} Apr 20 19:26:11.663450 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:11.663396 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6689b8cd74-x72mb" podStartSLOduration=0.663378636 podStartE2EDuration="663.378636ms" podCreationTimestamp="2026-04-20 19:26:11 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:26:11.662326294 +0000 UTC m=+338.817876113" watchObservedRunningTime="2026-04-20 19:26:11.663378636 +0000 UTC m=+338.818928457" Apr 20 19:26:21.367599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:21.367525 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:21.367599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:21.367561 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:21.372386 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:21.372365 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:21.681602 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:21.681515 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:26:21.726847 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:21.726813 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-658f8f576-4fx2t"] Apr 20 19:26:46.749715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:46.749649 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-658f8f576-4fx2t" podUID="0a8828db-dd56-45ea-a82e-bc0514330eff" containerName="console" containerID="cri-o://b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871" gracePeriod=15 Apr 20 19:26:46.990960 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:46.990935 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-658f8f576-4fx2t_0a8828db-dd56-45ea-a82e-bc0514330eff/console/0.log" Apr 20 19:26:46.991093 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:46.991008 2580 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:26:47.143712 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143673 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-trusted-ca-bundle\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 20 19:26:47.143712 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143722 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-oauth-config\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 20 19:26:47.143978 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143740 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-service-ca\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 20 19:26:47.143978 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143808 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-serving-cert\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 20 19:26:47.143978 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143865 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-console-config\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 
20 19:26:47.143978 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143898 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwhsq\" (UniqueName: \"kubernetes.io/projected/0a8828db-dd56-45ea-a82e-bc0514330eff-kube-api-access-qwhsq\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 20 19:26:47.143978 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.143923 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-oauth-serving-cert\") pod \"0a8828db-dd56-45ea-a82e-bc0514330eff\" (UID: \"0a8828db-dd56-45ea-a82e-bc0514330eff\") " Apr 20 19:26:47.144232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.144165 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:26:47.144232 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.144163 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-service-ca" (OuterVolumeSpecName: "service-ca") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:26:47.144520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.144492 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:26:47.144520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.144501 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-console-config" (OuterVolumeSpecName: "console-config") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:26:47.146204 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.146178 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:26:47.146744 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.146725 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:26:47.146796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.146750 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8828db-dd56-45ea-a82e-bc0514330eff-kube-api-access-qwhsq" (OuterVolumeSpecName: "kube-api-access-qwhsq") pod "0a8828db-dd56-45ea-a82e-bc0514330eff" (UID: "0a8828db-dd56-45ea-a82e-bc0514330eff"). InnerVolumeSpecName "kube-api-access-qwhsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:26:47.245312 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245279 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-trusted-ca-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.245312 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245306 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-oauth-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.245312 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245316 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-service-ca\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.245547 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245328 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a8828db-dd56-45ea-a82e-bc0514330eff-console-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.245547 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245338 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-console-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.245547 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245347 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwhsq\" (UniqueName: \"kubernetes.io/projected/0a8828db-dd56-45ea-a82e-bc0514330eff-kube-api-access-qwhsq\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.245547 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.245355 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a8828db-dd56-45ea-a82e-bc0514330eff-oauth-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:26:47.762937 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.762910 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-658f8f576-4fx2t_0a8828db-dd56-45ea-a82e-bc0514330eff/console/0.log" Apr 20 19:26:47.763366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.762950 2580 generic.go:358] "Generic (PLEG): container finished" podID="0a8828db-dd56-45ea-a82e-bc0514330eff" containerID="b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871" exitCode=2 Apr 20 19:26:47.763366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.762985 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658f8f576-4fx2t" event={"ID":"0a8828db-dd56-45ea-a82e-bc0514330eff","Type":"ContainerDied","Data":"b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871"} Apr 20 19:26:47.763366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.763015 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-658f8f576-4fx2t" Apr 20 19:26:47.763366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.763023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-658f8f576-4fx2t" event={"ID":"0a8828db-dd56-45ea-a82e-bc0514330eff","Type":"ContainerDied","Data":"59b5ac4eee631fb70280e4edbbffbbe73dbb957aaf421a9cc1d85518cda7b200"} Apr 20 19:26:47.763366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.763040 2580 scope.go:117] "RemoveContainer" containerID="b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871" Apr 20 19:26:47.771702 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.771690 2580 scope.go:117] "RemoveContainer" containerID="b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871" Apr 20 19:26:47.771973 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:26:47.771956 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871\": container with ID starting with b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871 not found: ID does not exist" containerID="b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871" Apr 20 19:26:47.772057 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.771979 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871"} err="failed to get container status \"b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871\": rpc error: code = NotFound desc = could not find container \"b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871\": container with ID starting with b271691c1e819ccb1594c3aadb108a5576b4b8b06d2833f56b381e26f15f6871 not found: ID does not exist" Apr 20 19:26:47.780765 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.780732 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-658f8f576-4fx2t"] Apr 20 19:26:47.784508 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:47.784484 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-658f8f576-4fx2t"] Apr 20 19:26:49.552726 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:26:49.552696 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8828db-dd56-45ea-a82e-bc0514330eff" path="/var/lib/kubelet/pods/0a8828db-dd56-45ea-a82e-bc0514330eff/volumes" Apr 20 19:27:12.520733 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.520699 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj"] Apr 20 19:27:12.521145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.521024 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a8828db-dd56-45ea-a82e-bc0514330eff" containerName="console" Apr 20 19:27:12.521145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.521035 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8828db-dd56-45ea-a82e-bc0514330eff" containerName="console" Apr 20 19:27:12.521145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.521089 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a8828db-dd56-45ea-a82e-bc0514330eff" containerName="console" Apr 20 19:27:12.524317 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.524287 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.527472 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.527451 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rbrg7\"" Apr 20 19:27:12.527599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.527494 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:27:12.527991 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.527968 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:27:12.534104 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.534084 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj"] Apr 20 19:27:12.552862 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.552840 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.552981 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.552876 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.552981 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.552896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t874\" (UniqueName: \"kubernetes.io/projected/5aa4f830-199f-43b5-affc-fab75d509c35-kube-api-access-8t874\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.653697 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.653648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.653697 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.653706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.653972 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.653729 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t874\" (UniqueName: \"kubernetes.io/projected/5aa4f830-199f-43b5-affc-fab75d509c35-kube-api-access-8t874\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.654049 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:27:12.654027 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.654109 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.654087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.662719 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.662695 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t874\" (UniqueName: \"kubernetes.io/projected/5aa4f830-199f-43b5-affc-fab75d509c35-kube-api-access-8t874\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.834530 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.834442 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" Apr 20 19:27:12.964210 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:12.964186 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj"] Apr 20 19:27:12.966821 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:27:12.966793 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5aa4f830_199f_43b5_affc_fab75d509c35.slice/crio-aa5ef1c210cb24da4f8b983aed7e0f05c764f5a7c08cd63501165bf1c17a33f4 WatchSource:0}: Error finding container aa5ef1c210cb24da4f8b983aed7e0f05c764f5a7c08cd63501165bf1c17a33f4: Status 404 returned error can't find the container with id aa5ef1c210cb24da4f8b983aed7e0f05c764f5a7c08cd63501165bf1c17a33f4 Apr 20 19:27:13.843143 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:13.843108 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" event={"ID":"5aa4f830-199f-43b5-affc-fab75d509c35","Type":"ContainerStarted","Data":"aa5ef1c210cb24da4f8b983aed7e0f05c764f5a7c08cd63501165bf1c17a33f4"} Apr 20 19:27:19.866414 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:19.866377 2580 generic.go:358] "Generic (PLEG): container finished" podID="5aa4f830-199f-43b5-affc-fab75d509c35" containerID="3610a9b7b9e6a13c2a82b584dd3b6a80549028b3178cd78cb1a4ca49c3b58cf8" exitCode=0 Apr 20 19:27:19.866800 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:19.866472 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" event={"ID":"5aa4f830-199f-43b5-affc-fab75d509c35","Type":"ContainerDied","Data":"3610a9b7b9e6a13c2a82b584dd3b6a80549028b3178cd78cb1a4ca49c3b58cf8"} Apr 20 19:27:22.877361 ip-10-0-134-118 kubenswrapper[2580]: 
I0420 19:27:22.877322 2580 generic.go:358] "Generic (PLEG): container finished" podID="5aa4f830-199f-43b5-affc-fab75d509c35" containerID="ed4941a14253d67bcce38b9fc608245345a95d7a10beac547030a3e9fbab5957" exitCode=0 Apr 20 19:27:22.877837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:22.877408 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" event={"ID":"5aa4f830-199f-43b5-affc-fab75d509c35","Type":"ContainerDied","Data":"ed4941a14253d67bcce38b9fc608245345a95d7a10beac547030a3e9fbab5957"} Apr 20 19:27:29.903719 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:29.903691 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" event={"ID":"5aa4f830-199f-43b5-affc-fab75d509c35","Type":"ContainerStarted","Data":"ffa8a2f67110529c8491195ac7e9a0231fad1556f1720d8d654d7a6db9a981d2"} Apr 20 19:27:29.919452 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:29.919356 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" podStartSLOduration=1.053936162 podStartE2EDuration="17.919339594s" podCreationTimestamp="2026-04-20 19:27:12 +0000 UTC" firstStartedPulling="2026-04-20 19:27:12.96873817 +0000 UTC m=+400.124287970" lastFinishedPulling="2026-04-20 19:27:29.834141605 +0000 UTC m=+416.989691402" observedRunningTime="2026-04-20 19:27:29.918897197 +0000 UTC m=+417.074447016" watchObservedRunningTime="2026-04-20 19:27:29.919339594 +0000 UTC m=+417.074889408" Apr 20 19:27:30.908150 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:30.908109 2580 generic.go:358] "Generic (PLEG): container finished" podID="5aa4f830-199f-43b5-affc-fab75d509c35" containerID="ffa8a2f67110529c8491195ac7e9a0231fad1556f1720d8d654d7a6db9a981d2" exitCode=0 Apr 20 19:27:30.908574 ip-10-0-134-118 kubenswrapper[2580]: 
I0420 19:27:30.908169 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" event={"ID":"5aa4f830-199f-43b5-affc-fab75d509c35","Type":"ContainerDied","Data":"ffa8a2f67110529c8491195ac7e9a0231fad1556f1720d8d654d7a6db9a981d2"}
Apr 20 19:27:32.040031 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.040005 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj"
Apr 20 19:27:32.135053 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.135014 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-bundle\") pod \"5aa4f830-199f-43b5-affc-fab75d509c35\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") "
Apr 20 19:27:32.135217 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.135077 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-util\") pod \"5aa4f830-199f-43b5-affc-fab75d509c35\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") "
Apr 20 19:27:32.135217 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.135121 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t874\" (UniqueName: \"kubernetes.io/projected/5aa4f830-199f-43b5-affc-fab75d509c35-kube-api-access-8t874\") pod \"5aa4f830-199f-43b5-affc-fab75d509c35\" (UID: \"5aa4f830-199f-43b5-affc-fab75d509c35\") "
Apr 20 19:27:32.135666 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.135642 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-bundle" (OuterVolumeSpecName: "bundle") pod "5aa4f830-199f-43b5-affc-fab75d509c35" (UID: "5aa4f830-199f-43b5-affc-fab75d509c35"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:27:32.137515 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.137493 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa4f830-199f-43b5-affc-fab75d509c35-kube-api-access-8t874" (OuterVolumeSpecName: "kube-api-access-8t874") pod "5aa4f830-199f-43b5-affc-fab75d509c35" (UID: "5aa4f830-199f-43b5-affc-fab75d509c35"). InnerVolumeSpecName "kube-api-access-8t874". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:27:32.140032 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.140005 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-util" (OuterVolumeSpecName: "util") pod "5aa4f830-199f-43b5-affc-fab75d509c35" (UID: "5aa4f830-199f-43b5-affc-fab75d509c35"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:27:32.236731 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.236644 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8t874\" (UniqueName: \"kubernetes.io/projected/5aa4f830-199f-43b5-affc-fab75d509c35-kube-api-access-8t874\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:27:32.236731 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.236676 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:27:32.236731 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.236686 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5aa4f830-199f-43b5-affc-fab75d509c35-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:27:32.915956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.915925 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj"
Apr 20 19:27:32.915956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.915931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dg8rtj" event={"ID":"5aa4f830-199f-43b5-affc-fab75d509c35","Type":"ContainerDied","Data":"aa5ef1c210cb24da4f8b983aed7e0f05c764f5a7c08cd63501165bf1c17a33f4"}
Apr 20 19:27:32.915956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:32.915963 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa5ef1c210cb24da4f8b983aed7e0f05c764f5a7c08cd63501165bf1c17a33f4"
Apr 20 19:27:40.507513 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507473 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"]
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507899 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="extract"
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507913 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="extract"
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507932 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="util"
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507938 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="util"
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507952 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="pull"
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.507957 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="pull"
Apr 20 19:27:40.508044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.508022 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="5aa4f830-199f-43b5-affc-fab75d509c35" containerName="extract"
Apr 20 19:27:40.512296 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.512280 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.514653 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.514625 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:40.514787 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.514665 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 19:27:40.514787 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.514624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-4mc7r\""
Apr 20 19:27:40.524578 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.524548 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"]
Apr 20 19:27:40.609274 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.609199 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99ab14f2-2fcc-4f26-adcc-eae3740ef97a-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-ggmz7\" (UID: \"99ab14f2-2fcc-4f26-adcc-eae3740ef97a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.609475 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.609355 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7vmq\" (UniqueName: \"kubernetes.io/projected/99ab14f2-2fcc-4f26-adcc-eae3740ef97a-kube-api-access-q7vmq\") pod \"cert-manager-operator-controller-manager-54b9655956-ggmz7\" (UID: \"99ab14f2-2fcc-4f26-adcc-eae3740ef97a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.710628 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.710594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7vmq\" (UniqueName: \"kubernetes.io/projected/99ab14f2-2fcc-4f26-adcc-eae3740ef97a-kube-api-access-q7vmq\") pod \"cert-manager-operator-controller-manager-54b9655956-ggmz7\" (UID: \"99ab14f2-2fcc-4f26-adcc-eae3740ef97a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.710779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.710648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99ab14f2-2fcc-4f26-adcc-eae3740ef97a-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-ggmz7\" (UID: \"99ab14f2-2fcc-4f26-adcc-eae3740ef97a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.711038 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.711015 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/99ab14f2-2fcc-4f26-adcc-eae3740ef97a-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-ggmz7\" (UID: \"99ab14f2-2fcc-4f26-adcc-eae3740ef97a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.718625 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.718601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7vmq\" (UniqueName: \"kubernetes.io/projected/99ab14f2-2fcc-4f26-adcc-eae3740ef97a-kube-api-access-q7vmq\") pod \"cert-manager-operator-controller-manager-54b9655956-ggmz7\" (UID: \"99ab14f2-2fcc-4f26-adcc-eae3740ef97a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.821108 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.821064 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"
Apr 20 19:27:40.952060 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:40.952030 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7"]
Apr 20 19:27:40.955640 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:27:40.955611 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ab14f2_2fcc_4f26_adcc_eae3740ef97a.slice/crio-2cec587215e78ee7df6e1d04d491e602b8a4eb33e541fdabebfef770af832168 WatchSource:0}: Error finding container 2cec587215e78ee7df6e1d04d491e602b8a4eb33e541fdabebfef770af832168: Status 404 returned error can't find the container with id 2cec587215e78ee7df6e1d04d491e602b8a4eb33e541fdabebfef770af832168
Apr 20 19:27:41.947333 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:41.947287 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7" event={"ID":"99ab14f2-2fcc-4f26-adcc-eae3740ef97a","Type":"ContainerStarted","Data":"2cec587215e78ee7df6e1d04d491e602b8a4eb33e541fdabebfef770af832168"}
Apr 20 19:27:42.955051 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:42.954966 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7" event={"ID":"99ab14f2-2fcc-4f26-adcc-eae3740ef97a","Type":"ContainerStarted","Data":"586f7a1dccb41bc04a1c973e19d59fc0a533a2f6bf4910028e182a2fba6e843f"}
Apr 20 19:27:42.977353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:42.977304 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-ggmz7" podStartSLOduration=1.364572532 podStartE2EDuration="2.977289994s" podCreationTimestamp="2026-04-20 19:27:40 +0000 UTC" firstStartedPulling="2026-04-20 19:27:40.958321694 +0000 UTC m=+428.113871495" lastFinishedPulling="2026-04-20 19:27:42.57103916 +0000 UTC m=+429.726588957" observedRunningTime="2026-04-20 19:27:42.976372072 +0000 UTC m=+430.131921913" watchObservedRunningTime="2026-04-20 19:27:42.977289994 +0000 UTC m=+430.132839812"
Apr 20 19:27:44.194506 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.194468 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"]
Apr 20 19:27:44.196901 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.196882 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.199846 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.199826 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rbrg7\""
Apr 20 19:27:44.199960 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.199878 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:27:44.200482 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.200466 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:27:44.206535 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.206516 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"]
Apr 20 19:27:44.342503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.342466 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.342676 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.342533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.342676 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.342573 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk9h2\" (UniqueName: \"kubernetes.io/projected/ee7ed166-6efe-4c5e-a146-b70feb6121ef-kube-api-access-mk9h2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.443821 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.443791 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.443994 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.443833 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk9h2\" (UniqueName: \"kubernetes.io/projected/ee7ed166-6efe-4c5e-a146-b70feb6121ef-kube-api-access-mk9h2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.443994 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.443869 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.444269 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.444226 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.444321 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.444270 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.454747 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.454678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk9h2\" (UniqueName: \"kubernetes.io/projected/ee7ed166-6efe-4c5e-a146-b70feb6121ef-kube-api-access-mk9h2\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.506089 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.506061 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:44.636684 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.636653 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"]
Apr 20 19:27:44.640173 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:27:44.640144 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee7ed166_6efe_4c5e_a146_b70feb6121ef.slice/crio-b80dc3713a0997a2ddb6f469adbcd243f3527b17392787aa639dfb9806553491 WatchSource:0}: Error finding container b80dc3713a0997a2ddb6f469adbcd243f3527b17392787aa639dfb9806553491: Status 404 returned error can't find the container with id b80dc3713a0997a2ddb6f469adbcd243f3527b17392787aa639dfb9806553491
Apr 20 19:27:44.963330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.963295 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerID="f2484848eb09e356f3973ed4c64d047ab5c1451123556d34de0942a2e8c468e4" exitCode=0
Apr 20 19:27:44.963509 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.963369 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf" event={"ID":"ee7ed166-6efe-4c5e-a146-b70feb6121ef","Type":"ContainerDied","Data":"f2484848eb09e356f3973ed4c64d047ab5c1451123556d34de0942a2e8c468e4"}
Apr 20 19:27:44.963509 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:44.963391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf" event={"ID":"ee7ed166-6efe-4c5e-a146-b70feb6121ef","Type":"ContainerStarted","Data":"b80dc3713a0997a2ddb6f469adbcd243f3527b17392787aa639dfb9806553491"}
Apr 20 19:27:49.986545 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:49.986458 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerID="8aa7d8dc751f6f9216c16ed12a7567d3ef6e442af3b2b518cbdfad65b9c64090" exitCode=0
Apr 20 19:27:49.986545 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:49.986508 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf" event={"ID":"ee7ed166-6efe-4c5e-a146-b70feb6121ef","Type":"ContainerDied","Data":"8aa7d8dc751f6f9216c16ed12a7567d3ef6e442af3b2b518cbdfad65b9c64090"}
Apr 20 19:27:50.991451 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:50.991415 2580 generic.go:358] "Generic (PLEG): container finished" podID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerID="c0082135f92636b7dfac1470b6bf593005e5992d22876ef76032d5eee539be92" exitCode=0
Apr 20 19:27:50.991888 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:50.991508 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf" event={"ID":"ee7ed166-6efe-4c5e-a146-b70feb6121ef","Type":"ContainerDied","Data":"c0082135f92636b7dfac1470b6bf593005e5992d22876ef76032d5eee539be92"}
Apr 20 19:27:52.114316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.114294 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:52.214835 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.214800 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-bundle\") pod \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") "
Apr 20 19:27:52.214968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.214856 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk9h2\" (UniqueName: \"kubernetes.io/projected/ee7ed166-6efe-4c5e-a146-b70feb6121ef-kube-api-access-mk9h2\") pod \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") "
Apr 20 19:27:52.214968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.214927 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-util\") pod \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\" (UID: \"ee7ed166-6efe-4c5e-a146-b70feb6121ef\") "
Apr 20 19:27:52.215222 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.215198 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-bundle" (OuterVolumeSpecName: "bundle") pod "ee7ed166-6efe-4c5e-a146-b70feb6121ef" (UID: "ee7ed166-6efe-4c5e-a146-b70feb6121ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:27:52.217108 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.217079 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7ed166-6efe-4c5e-a146-b70feb6121ef-kube-api-access-mk9h2" (OuterVolumeSpecName: "kube-api-access-mk9h2") pod "ee7ed166-6efe-4c5e-a146-b70feb6121ef" (UID: "ee7ed166-6efe-4c5e-a146-b70feb6121ef"). InnerVolumeSpecName "kube-api-access-mk9h2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:27:52.219140 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.219120 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-util" (OuterVolumeSpecName: "util") pod "ee7ed166-6efe-4c5e-a146-b70feb6121ef" (UID: "ee7ed166-6efe-4c5e-a146-b70feb6121ef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:27:52.315486 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.315461 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mk9h2\" (UniqueName: \"kubernetes.io/projected/ee7ed166-6efe-4c5e-a146-b70feb6121ef-kube-api-access-mk9h2\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:27:52.315486 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.315484 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:27:52.315645 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.315494 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee7ed166-6efe-4c5e-a146-b70feb6121ef-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:27:52.999536 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.999499 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf" event={"ID":"ee7ed166-6efe-4c5e-a146-b70feb6121ef","Type":"ContainerDied","Data":"b80dc3713a0997a2ddb6f469adbcd243f3527b17392787aa639dfb9806553491"}
Apr 20 19:27:52.999536 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.999534 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80dc3713a0997a2ddb6f469adbcd243f3527b17392787aa639dfb9806553491"
Apr 20 19:27:52.999738 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:52.999552 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fg2hmf"
Apr 20 19:27:55.023363 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023322 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-qrg4v"]
Apr 20 19:27:55.023732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023690 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="pull"
Apr 20 19:27:55.023732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023702 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="pull"
Apr 20 19:27:55.023732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023718 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="util"
Apr 20 19:27:55.023732 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023723 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="util"
Apr 20 19:27:55.023868 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023739 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="extract"
Apr 20 19:27:55.023868 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023745 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="extract"
Apr 20 19:27:55.023868 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.023805 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee7ed166-6efe-4c5e-a146-b70feb6121ef" containerName="extract"
Apr 20 19:27:55.025763 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.025748 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.028369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.028343 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-tmmkq\""
Apr 20 19:27:55.028454 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.028349 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 19:27:55.029231 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.029215 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 19:27:55.033494 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.033473 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-qrg4v"]
Apr 20 19:27:55.135825 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.135790 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1650db9-c55f-4a18-b87b-a19bc905b6b4-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-qrg4v\" (UID: \"a1650db9-c55f-4a18-b87b-a19bc905b6b4\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.136015 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.135828 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t6g4\" (UniqueName: \"kubernetes.io/projected/a1650db9-c55f-4a18-b87b-a19bc905b6b4-kube-api-access-2t6g4\") pod \"cert-manager-webhook-587ccfb98-qrg4v\" (UID: \"a1650db9-c55f-4a18-b87b-a19bc905b6b4\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.236759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.236725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1650db9-c55f-4a18-b87b-a19bc905b6b4-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-qrg4v\" (UID: \"a1650db9-c55f-4a18-b87b-a19bc905b6b4\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.236759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.236761 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t6g4\" (UniqueName: \"kubernetes.io/projected/a1650db9-c55f-4a18-b87b-a19bc905b6b4-kube-api-access-2t6g4\") pod \"cert-manager-webhook-587ccfb98-qrg4v\" (UID: \"a1650db9-c55f-4a18-b87b-a19bc905b6b4\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.245726 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.245691 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a1650db9-c55f-4a18-b87b-a19bc905b6b4-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-qrg4v\" (UID: \"a1650db9-c55f-4a18-b87b-a19bc905b6b4\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.245854 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.245771 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t6g4\" (UniqueName: \"kubernetes.io/projected/a1650db9-c55f-4a18-b87b-a19bc905b6b4-kube-api-access-2t6g4\") pod \"cert-manager-webhook-587ccfb98-qrg4v\" (UID: \"a1650db9-c55f-4a18-b87b-a19bc905b6b4\") " pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.348092 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.348022 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:55.472479 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:55.472447 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-qrg4v"]
Apr 20 19:27:55.475442 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:27:55.475410 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1650db9_c55f_4a18_b87b_a19bc905b6b4.slice/crio-8b02054c279ad3e7b583aefeafcf8b7352ae0aff4f4aafaa85284b9354f9bad7 WatchSource:0}: Error finding container 8b02054c279ad3e7b583aefeafcf8b7352ae0aff4f4aafaa85284b9354f9bad7: Status 404 returned error can't find the container with id 8b02054c279ad3e7b583aefeafcf8b7352ae0aff4f4aafaa85284b9354f9bad7
Apr 20 19:27:56.009711 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:56.009678 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v" event={"ID":"a1650db9-c55f-4a18-b87b-a19bc905b6b4","Type":"ContainerStarted","Data":"8b02054c279ad3e7b583aefeafcf8b7352ae0aff4f4aafaa85284b9354f9bad7"}
Apr 20 19:27:58.815320 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.815288 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4q2w7"]
Apr 20 19:27:58.817541 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.817525 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:58.819682 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.819662 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-mjw92\""
Apr 20 19:27:58.827737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.827716 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4q2w7"]
Apr 20 19:27:58.867977 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.867944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpqfk\" (UniqueName: \"kubernetes.io/projected/7169e849-2277-4660-88ab-464642cf1f28-kube-api-access-qpqfk\") pod \"cert-manager-cainjector-68b757865b-4q2w7\" (UID: \"7169e849-2277-4660-88ab-464642cf1f28\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:58.867977 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.867978 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7169e849-2277-4660-88ab-464642cf1f28-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4q2w7\" (UID: \"7169e849-2277-4660-88ab-464642cf1f28\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:58.968943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.968903 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpqfk\" (UniqueName: \"kubernetes.io/projected/7169e849-2277-4660-88ab-464642cf1f28-kube-api-access-qpqfk\") pod \"cert-manager-cainjector-68b757865b-4q2w7\" (UID: \"7169e849-2277-4660-88ab-464642cf1f28\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:58.968943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.968942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7169e849-2277-4660-88ab-464642cf1f28-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4q2w7\" (UID: \"7169e849-2277-4660-88ab-464642cf1f28\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:58.976896 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.976864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7169e849-2277-4660-88ab-464642cf1f28-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4q2w7\" (UID: \"7169e849-2277-4660-88ab-464642cf1f28\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:58.977041 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:58.976946 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpqfk\" (UniqueName: \"kubernetes.io/projected/7169e849-2277-4660-88ab-464642cf1f28-kube-api-access-qpqfk\") pod \"cert-manager-cainjector-68b757865b-4q2w7\" (UID: \"7169e849-2277-4660-88ab-464642cf1f28\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:59.022729 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:59.022692 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v" event={"ID":"a1650db9-c55f-4a18-b87b-a19bc905b6b4","Type":"ContainerStarted","Data":"c87b42e659a66a5016145426703043c57d422ad2b677184358e8608103cce73a"}
Apr 20 19:27:59.022888 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:59.022776 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:27:59.038847 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:59.038801 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v" podStartSLOduration=1.197411086 podStartE2EDuration="4.038789601s" podCreationTimestamp="2026-04-20 19:27:55 +0000 UTC" firstStartedPulling="2026-04-20 19:27:55.477287697 +0000 UTC m=+442.632837496" lastFinishedPulling="2026-04-20 19:27:58.31866621 +0000 UTC m=+445.474216011" observedRunningTime="2026-04-20 19:27:59.037245741 +0000 UTC m=+446.192795561" watchObservedRunningTime="2026-04-20 19:27:59.038789601 +0000 UTC m=+446.194339420"
Apr 20 19:27:59.126172 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:59.126082 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7"
Apr 20 19:27:59.251399 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:27:59.251362 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4q2w7"]
Apr 20 19:27:59.254020 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:27:59.253984 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7169e849_2277_4660_88ab_464642cf1f28.slice/crio-c95068c421d99ac3c426965ab09a35011aeb3cf4593d563a309d639af098bfc5 WatchSource:0}: Error finding container c95068c421d99ac3c426965ab09a35011aeb3cf4593d563a309d639af098bfc5: Status 404 returned error can't find the container with id c95068c421d99ac3c426965ab09a35011aeb3cf4593d563a309d639af098bfc5
Apr 20 19:28:00.027330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:00.027291 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7" event={"ID":"7169e849-2277-4660-88ab-464642cf1f28","Type":"ContainerStarted","Data":"131f0cdbbf517d554e624ca2d040d96a22b9455d23fa9789b344a84eac526dcf"}
Apr 20 19:28:00.027330 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:00.027330 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7" event={"ID":"7169e849-2277-4660-88ab-464642cf1f28","Type":"ContainerStarted","Data":"c95068c421d99ac3c426965ab09a35011aeb3cf4593d563a309d639af098bfc5"}
Apr 20 19:28:00.042021 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:00.041970 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-4q2w7" podStartSLOduration=2.041952757 podStartE2EDuration="2.041952757s" podCreationTimestamp="2026-04-20 19:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:28:00.041455227 +0000 UTC m=+447.197005045" watchObservedRunningTime="2026-04-20 19:28:00.041952757 +0000 UTC m=+447.197502577"
Apr 20 19:28:05.029811 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:05.029774 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-qrg4v"
Apr 20 19:28:08.996790 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:08.996728 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"]
Apr 20 19:28:08.999609 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:08.999592 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.002080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.002055 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:28:09.002216 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.002055 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rbrg7\""
Apr 20 19:28:09.002216 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.002095 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:28:09.008194 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.008169 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"]
Apr 20 19:28:09.049706 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.049671 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5cn\" (UniqueName: \"kubernetes.io/projected/92c74f7d-ce11-4300-913d-4a83e8122a3d-kube-api-access-nz5cn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.049897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.049716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.049897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.049784 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.150542 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.150501 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz5cn\" (UniqueName: \"kubernetes.io/projected/92c74f7d-ce11-4300-913d-4a83e8122a3d-kube-api-access-nz5cn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.150745 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.150549 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.150745 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.150586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.150967 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.150943 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.151057 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.151012 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.158623 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.158597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz5cn\" (UniqueName: \"kubernetes.io/projected/92c74f7d-ce11-4300-913d-4a83e8122a3d-kube-api-access-nz5cn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.310951 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.310899 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:09.444188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:09.444129 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"]
Apr 20 19:28:09.446720 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:09.446690 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92c74f7d_ce11_4300_913d_4a83e8122a3d.slice/crio-141ebf8da98e1e060c66922b306418dbc815f967e779052a15e88044afa36a94 WatchSource:0}: Error finding container 141ebf8da98e1e060c66922b306418dbc815f967e779052a15e88044afa36a94: Status 404 returned error can't find the container with id 141ebf8da98e1e060c66922b306418dbc815f967e779052a15e88044afa36a94
Apr 20 19:28:10.063702 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:10.063664 2580 generic.go:358] "Generic (PLEG): container finished" podID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerID="4af8d425625d7a84c60286d596184bbb43a7360811274e9915be118b194c3afb" exitCode=0
Apr 20 19:28:10.064101 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:10.063752 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659" event={"ID":"92c74f7d-ce11-4300-913d-4a83e8122a3d","Type":"ContainerDied","Data":"4af8d425625d7a84c60286d596184bbb43a7360811274e9915be118b194c3afb"}
Apr 20 19:28:10.064101 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:10.063784 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659" event={"ID":"92c74f7d-ce11-4300-913d-4a83e8122a3d","Type":"ContainerStarted","Data":"141ebf8da98e1e060c66922b306418dbc815f967e779052a15e88044afa36a94"}
Apr 20 19:28:11.069793 ip-10-0-134-118 kubenswrapper[2580]:
I0420 19:28:11.069759 2580 generic.go:358] "Generic (PLEG): container finished" podID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerID="18194955a53e41866ff655c5c1a51d493bc100d1fa1ec5120592b894fca47af3" exitCode=0
Apr 20 19:28:11.070274 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:11.069848 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659" event={"ID":"92c74f7d-ce11-4300-913d-4a83e8122a3d","Type":"ContainerDied","Data":"18194955a53e41866ff655c5c1a51d493bc100d1fa1ec5120592b894fca47af3"}
Apr 20 19:28:12.075927 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:12.075888 2580 generic.go:358] "Generic (PLEG): container finished" podID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerID="e05d680e50c0a14e7af8eca000a91aa35121b443ee90b18862d89a981f17b97b" exitCode=0
Apr 20 19:28:12.076431 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:12.075974 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659" event={"ID":"92c74f7d-ce11-4300-913d-4a83e8122a3d","Type":"ContainerDied","Data":"e05d680e50c0a14e7af8eca000a91aa35121b443ee90b18862d89a981f17b97b"}
Apr 20 19:28:13.209241 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.209217 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:13.287243 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.287204 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-util\") pod \"92c74f7d-ce11-4300-913d-4a83e8122a3d\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") "
Apr 20 19:28:13.287450 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.287314 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-bundle\") pod \"92c74f7d-ce11-4300-913d-4a83e8122a3d\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") "
Apr 20 19:28:13.287450 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.287360 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz5cn\" (UniqueName: \"kubernetes.io/projected/92c74f7d-ce11-4300-913d-4a83e8122a3d-kube-api-access-nz5cn\") pod \"92c74f7d-ce11-4300-913d-4a83e8122a3d\" (UID: \"92c74f7d-ce11-4300-913d-4a83e8122a3d\") "
Apr 20 19:28:13.288119 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.288086 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-bundle" (OuterVolumeSpecName: "bundle") pod "92c74f7d-ce11-4300-913d-4a83e8122a3d" (UID: "92c74f7d-ce11-4300-913d-4a83e8122a3d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:13.289667 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.289639 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92c74f7d-ce11-4300-913d-4a83e8122a3d-kube-api-access-nz5cn" (OuterVolumeSpecName: "kube-api-access-nz5cn") pod "92c74f7d-ce11-4300-913d-4a83e8122a3d" (UID: "92c74f7d-ce11-4300-913d-4a83e8122a3d"). InnerVolumeSpecName "kube-api-access-nz5cn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:28:13.293050 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.293029 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-util" (OuterVolumeSpecName: "util") pod "92c74f7d-ce11-4300-913d-4a83e8122a3d" (UID: "92c74f7d-ce11-4300-913d-4a83e8122a3d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:13.387905 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.387832 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:13.387905 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.387858 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nz5cn\" (UniqueName: \"kubernetes.io/projected/92c74f7d-ce11-4300-913d-4a83e8122a3d-kube-api-access-nz5cn\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:13.387905 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.387872 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/92c74f7d-ce11-4300-913d-4a83e8122a3d-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:13.956013 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.955979 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-j5bdv"]
Apr 20 19:28:13.956358 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956344 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="util"
Apr 20 19:28:13.956417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956360 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="util"
Apr 20 19:28:13.956417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956367 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="pull"
Apr 20 19:28:13.956417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956372 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="pull"
Apr 20 19:28:13.956417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956381 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="extract"
Apr 20 19:28:13.956417 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956386 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="extract"
Apr 20 19:28:13.956575 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.956447 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="92c74f7d-ce11-4300-913d-4a83e8122a3d" containerName="extract"
Apr 20 19:28:13.958791 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.958772 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:13.960986 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.960965 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-wjwkv\""
Apr 20 19:28:13.967637 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.967613 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-j5bdv"]
Apr 20 19:28:13.993528 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.993495 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdbc7017-ec91-4835-a32d-db057733929f-bound-sa-token\") pod \"cert-manager-79c8d999ff-j5bdv\" (UID: \"bdbc7017-ec91-4835-a32d-db057733929f\") " pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:13.993707 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:13.993543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zspq4\" (UniqueName: \"kubernetes.io/projected/bdbc7017-ec91-4835-a32d-db057733929f-kube-api-access-zspq4\") pod \"cert-manager-79c8d999ff-j5bdv\" (UID: \"bdbc7017-ec91-4835-a32d-db057733929f\") " pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:14.085538 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.085498 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659" event={"ID":"92c74f7d-ce11-4300-913d-4a83e8122a3d","Type":"ContainerDied","Data":"141ebf8da98e1e060c66922b306418dbc815f967e779052a15e88044afa36a94"}
Apr 20 19:28:14.085538 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.085536 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="141ebf8da98e1e060c66922b306418dbc815f967e779052a15e88044afa36a94"
Apr 20 19:28:14.085538 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.085541 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5f4659"
Apr 20 19:28:14.095011 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.094972 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zspq4\" (UniqueName: \"kubernetes.io/projected/bdbc7017-ec91-4835-a32d-db057733929f-kube-api-access-zspq4\") pod \"cert-manager-79c8d999ff-j5bdv\" (UID: \"bdbc7017-ec91-4835-a32d-db057733929f\") " pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:14.095192 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.095103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdbc7017-ec91-4835-a32d-db057733929f-bound-sa-token\") pod \"cert-manager-79c8d999ff-j5bdv\" (UID: \"bdbc7017-ec91-4835-a32d-db057733929f\") " pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:14.103070 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.103040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bdbc7017-ec91-4835-a32d-db057733929f-bound-sa-token\") pod \"cert-manager-79c8d999ff-j5bdv\" (UID: \"bdbc7017-ec91-4835-a32d-db057733929f\") " pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:14.103183 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.103163 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zspq4\" (UniqueName: \"kubernetes.io/projected/bdbc7017-ec91-4835-a32d-db057733929f-kube-api-access-zspq4\") pod \"cert-manager-79c8d999ff-j5bdv\" (UID: \"bdbc7017-ec91-4835-a32d-db057733929f\") " pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:14.269775 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.269741 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-j5bdv"
Apr 20 19:28:14.398480 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:14.398456 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-j5bdv"]
Apr 20 19:28:14.400886 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:14.400859 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdbc7017_ec91_4835_a32d_db057733929f.slice/crio-a0c29edda191a2b8fd0200159276dc859ce062c88cd66fdbacaaf25c39fc12e7 WatchSource:0}: Error finding container a0c29edda191a2b8fd0200159276dc859ce062c88cd66fdbacaaf25c39fc12e7: Status 404 returned error can't find the container with id a0c29edda191a2b8fd0200159276dc859ce062c88cd66fdbacaaf25c39fc12e7
Apr 20 19:28:15.096571 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:15.096537 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-j5bdv" event={"ID":"bdbc7017-ec91-4835-a32d-db057733929f","Type":"ContainerStarted","Data":"aa98dcace7fda10a48caad5d9a718d954e66dda0d27e9ee2840f98bd5e947018"}
Apr 20 19:28:15.096571 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:15.096571 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-j5bdv" event={"ID":"bdbc7017-ec91-4835-a32d-db057733929f","Type":"ContainerStarted","Data":"a0c29edda191a2b8fd0200159276dc859ce062c88cd66fdbacaaf25c39fc12e7"}
Apr 20 19:28:15.115944 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:15.115900 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-j5bdv" podStartSLOduration=2.115887511 podStartE2EDuration="2.115887511s" podCreationTimestamp="2026-04-20 19:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:28:15.11440969 +0000 UTC m=+462.269959510" watchObservedRunningTime="2026-04-20 19:28:15.115887511 +0000 UTC m=+462.271437329"
Apr 20 19:28:19.856715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.856678 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"]
Apr 20 19:28:19.861188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.861168 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:19.863597 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.863576 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:28:19.863713 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.863670 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:28:19.864460 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.864439 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rbrg7\""
Apr 20 19:28:19.873935 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.873910 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"]
Apr 20 19:28:19.948006 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.947966 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:19.948006
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.948016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqp5\" (UniqueName: \"kubernetes.io/projected/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-kube-api-access-qwqp5\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:19.948231 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:19.948093 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.048641 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.048598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.048825 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.048649 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqp5\" (UniqueName: \"kubernetes.io/projected/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-kube-api-access-qwqp5\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.048825 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.048680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.048998 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.048978 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.049049 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.049016 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.058291 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.058240 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqp5\" (UniqueName: \"kubernetes.io/projected/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-kube-api-access-qwqp5\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.171202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.171121 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"
Apr 20 19:28:20.311359 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:20.311328 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px"]
Apr 20 19:28:20.314978 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:20.314951 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c43ed9_6ebc_43c4_a8e3_91fe2ca627ee.slice/crio-e2244479544b5a6b2f185bc25af560de5b158d886f77a4b7f1f4201d17e2ad1d WatchSource:0}: Error finding container e2244479544b5a6b2f185bc25af560de5b158d886f77a4b7f1f4201d17e2ad1d: Status 404 returned error can't find the container with id e2244479544b5a6b2f185bc25af560de5b158d886f77a4b7f1f4201d17e2ad1d
Apr 20 19:28:21.120788 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.120748 2580 generic.go:358] "Generic (PLEG): container finished" podID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerID="78b50d920f3c75fc1fe661afc6d9fbb014f06b522e741ee31c0786c68f7daa54" exitCode=0
Apr 20 19:28:21.121167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.120839 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" event={"ID":"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee","Type":"ContainerDied","Data":"78b50d920f3c75fc1fe661afc6d9fbb014f06b522e741ee31c0786c68f7daa54"}
Apr 20 19:28:21.121167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.120884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" event={"ID":"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee","Type":"ContainerStarted","Data":"e2244479544b5a6b2f185bc25af560de5b158d886f77a4b7f1f4201d17e2ad1d"}
Apr 20 19:28:21.306128 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.306092 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"]
Apr 20 19:28:21.308581 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.308560 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"
Apr 20 19:28:21.311369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.311343 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 19:28:21.311465 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.311392 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 19:28:21.311609 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.311594 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 19:28:21.311710 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.311689 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pkszx\""
Apr 20 19:28:21.311894 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.311874 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 19:28:21.327908 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.327886 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"]
Apr 20 19:28:21.367649 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.367614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwcqd\" (UniqueName: \"kubernetes.io/projected/ff798f5c-331d-412b-8748-5adc81c3d101-kube-api-access-jwcqd\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"
Apr 20 19:28:21.367833 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.367699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff798f5c-331d-412b-8748-5adc81c3d101-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"
Apr 20 19:28:21.367833 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.367754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff798f5c-331d-412b-8748-5adc81c3d101-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"
Apr 20 19:28:21.469554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.468886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff798f5c-331d-412b-8748-5adc81c3d101-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"
Apr 20 19:28:21.469554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.469007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwcqd\" (UniqueName: \"kubernetes.io/projected/ff798f5c-331d-412b-8748-5adc81c3d101-kube-api-access-jwcqd\") pod
\"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:21.469554 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.469052 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff798f5c-331d-412b-8748-5adc81c3d101-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:21.472776 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.472743 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff798f5c-331d-412b-8748-5adc81c3d101-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:21.473503 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.473478 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff798f5c-331d-412b-8748-5adc81c3d101-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:21.481693 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.481669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwcqd\" (UniqueName: \"kubernetes.io/projected/ff798f5c-331d-412b-8748-5adc81c3d101-kube-api-access-jwcqd\") pod \"opendatahub-operator-controller-manager-7875d57869-w7znh\" (UID: \"ff798f5c-331d-412b-8748-5adc81c3d101\") " 
pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:21.621011 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.620927 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:21.803448 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:21.803417 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh"] Apr 20 19:28:21.838329 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:21.838301 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff798f5c_331d_412b_8748_5adc81c3d101.slice/crio-3846c4c6a8e3a3b16b0078ebcf3f8f3191e1f44b571d61fbfe077568a8830059 WatchSource:0}: Error finding container 3846c4c6a8e3a3b16b0078ebcf3f8f3191e1f44b571d61fbfe077568a8830059: Status 404 returned error can't find the container with id 3846c4c6a8e3a3b16b0078ebcf3f8f3191e1f44b571d61fbfe077568a8830059 Apr 20 19:28:22.125575 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:22.125538 2580 generic.go:358] "Generic (PLEG): container finished" podID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerID="a8a7ff3fb7581d302bb9fad493e2fd992a9c238dc4e865aa93d2a37185ec2595" exitCode=0 Apr 20 19:28:22.126003 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:22.125623 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" event={"ID":"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee","Type":"ContainerDied","Data":"a8a7ff3fb7581d302bb9fad493e2fd992a9c238dc4e865aa93d2a37185ec2595"} Apr 20 19:28:22.126994 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:22.126972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" 
event={"ID":"ff798f5c-331d-412b-8748-5adc81c3d101","Type":"ContainerStarted","Data":"3846c4c6a8e3a3b16b0078ebcf3f8f3191e1f44b571d61fbfe077568a8830059"} Apr 20 19:28:23.133287 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:23.133227 2580 generic.go:358] "Generic (PLEG): container finished" podID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerID="c5cf260281aaace32dad8dd64ca31300cb114db8068072f6b9ff0d2ee0257158" exitCode=0 Apr 20 19:28:23.133730 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:23.133287 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" event={"ID":"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee","Type":"ContainerDied","Data":"c5cf260281aaace32dad8dd64ca31300cb114db8068072f6b9ff0d2ee0257158"} Apr 20 19:28:24.343005 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.342983 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" Apr 20 19:28:24.395859 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.395785 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-util\") pod \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " Apr 20 19:28:24.395999 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.395878 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-bundle\") pod \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " Apr 20 19:28:24.395999 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.395955 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwqp5\" (UniqueName: 
\"kubernetes.io/projected/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-kube-api-access-qwqp5\") pod \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\" (UID: \"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee\") " Apr 20 19:28:24.397138 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.397098 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-bundle" (OuterVolumeSpecName: "bundle") pod "01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" (UID: "01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:28:24.398308 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.398277 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-kube-api-access-qwqp5" (OuterVolumeSpecName: "kube-api-access-qwqp5") pod "01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" (UID: "01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee"). InnerVolumeSpecName "kube-api-access-qwqp5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:28:24.404021 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.403992 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-util" (OuterVolumeSpecName: "util") pod "01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" (UID: "01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:28:24.496592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.496547 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwqp5\" (UniqueName: \"kubernetes.io/projected/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-kube-api-access-qwqp5\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:28:24.496592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.496588 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:28:24.496592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:24.496599 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:28:25.143281 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:25.143215 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" event={"ID":"ff798f5c-331d-412b-8748-5adc81c3d101","Type":"ContainerStarted","Data":"6beb226c0d0a7db1e32c12a54380fba3e1164e031e706de521e14b8f457ccad4"} Apr 20 19:28:25.143483 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:25.143458 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:25.145016 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:25.144991 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" event={"ID":"01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee","Type":"ContainerDied","Data":"e2244479544b5a6b2f185bc25af560de5b158d886f77a4b7f1f4201d17e2ad1d"} Apr 20 19:28:25.145016 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:28:25.145018 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2244479544b5a6b2f185bc25af560de5b158d886f77a4b7f1f4201d17e2ad1d" Apr 20 19:28:25.145195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:25.145028 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9742px" Apr 20 19:28:25.172031 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:25.171980 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" podStartSLOduration=1.773054898 podStartE2EDuration="4.171966398s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:21.840153145 +0000 UTC m=+468.995702944" lastFinishedPulling="2026-04-20 19:28:24.239064635 +0000 UTC m=+471.394614444" observedRunningTime="2026-04-20 19:28:25.169207191 +0000 UTC m=+472.324757022" watchObservedRunningTime="2026-04-20 19:28:25.171966398 +0000 UTC m=+472.327516217" Apr 20 19:28:33.260013 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.259978 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn"] Apr 20 19:28:33.260558 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260538 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="extract" Apr 20 19:28:33.260601 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260564 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="extract" Apr 20 19:28:33.260601 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260590 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="util" Apr 20 
19:28:33.260601 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260598 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="util" Apr 20 19:28:33.260700 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260633 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="pull" Apr 20 19:28:33.260700 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260642 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="pull" Apr 20 19:28:33.260762 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.260735 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="01c43ed9-6ebc-43c4-a8e3-91fe2ca627ee" containerName="extract" Apr 20 19:28:33.267159 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.267137 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.272644 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.272603 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn"] Apr 20 19:28:33.272954 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.272933 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:28:33.273165 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.273147 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:28:33.274127 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.274099 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:28:33.274496 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.274474 
2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5slkl\"" Apr 20 19:28:33.274779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.274502 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:28:33.274995 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.274543 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:28:33.377890 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.377851 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw7m4\" (UniqueName: \"kubernetes.io/projected/a2827786-8b73-49ef-92e2-8988ac55b679-kube-api-access-xw7m4\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.378080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.377920 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2827786-8b73-49ef-92e2-8988ac55b679-cert\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.378080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.377938 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2827786-8b73-49ef-92e2-8988ac55b679-metrics-cert\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.378080 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:28:33.378055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a2827786-8b73-49ef-92e2-8988ac55b679-manager-config\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.478603 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.478574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2827786-8b73-49ef-92e2-8988ac55b679-cert\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.478603 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.478608 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2827786-8b73-49ef-92e2-8988ac55b679-metrics-cert\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.478841 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.478654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a2827786-8b73-49ef-92e2-8988ac55b679-manager-config\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.478841 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.478707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw7m4\" (UniqueName: 
\"kubernetes.io/projected/a2827786-8b73-49ef-92e2-8988ac55b679-kube-api-access-xw7m4\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.480898 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.480871 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:28:33.480991 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.480919 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:28:33.481025 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.480994 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:28:33.486342 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.486325 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:28:33.489493 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.489467 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/a2827786-8b73-49ef-92e2-8988ac55b679-manager-config\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.491514 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.491480 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2827786-8b73-49ef-92e2-8988ac55b679-metrics-cert\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.491610 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.491483 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2827786-8b73-49ef-92e2-8988ac55b679-cert\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.496882 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.496861 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:28:33.507349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.507324 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw7m4\" (UniqueName: \"kubernetes.io/projected/a2827786-8b73-49ef-92e2-8988ac55b679-kube-api-access-xw7m4\") pod \"lws-controller-manager-5c6db948fd-794jn\" (UID: \"a2827786-8b73-49ef-92e2-8988ac55b679\") " pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.585342 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.585245 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5slkl\"" Apr 20 19:28:33.593329 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.593297 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:33.729310 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:33.729282 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn"] Apr 20 19:28:33.731529 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:33.731491 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2827786_8b73_49ef_92e2_8988ac55b679.slice/crio-26d308b8dd16f4146724baef9a2150df75a29c21e88a11868e16b6fc422f1dc3 WatchSource:0}: Error finding container 26d308b8dd16f4146724baef9a2150df75a29c21e88a11868e16b6fc422f1dc3: Status 404 returned error can't find the container with id 26d308b8dd16f4146724baef9a2150df75a29c21e88a11868e16b6fc422f1dc3 Apr 20 19:28:34.179864 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:34.179820 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" event={"ID":"a2827786-8b73-49ef-92e2-8988ac55b679","Type":"ContainerStarted","Data":"26d308b8dd16f4146724baef9a2150df75a29c21e88a11868e16b6fc422f1dc3"} Apr 20 19:28:36.152153 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:36.152119 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-w7znh" Apr 20 19:28:36.190687 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:36.190647 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" event={"ID":"a2827786-8b73-49ef-92e2-8988ac55b679","Type":"ContainerStarted","Data":"b4cd31267170fcb97787aacf423e662ab0cc7e587351969f52ecb245fa6aa5f7"} Apr 20 19:28:36.190862 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:36.190721 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" Apr 20 19:28:36.216604 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:36.216539 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn" podStartSLOduration=1.6023108160000001 podStartE2EDuration="3.216518643s" podCreationTimestamp="2026-04-20 19:28:33 +0000 UTC" firstStartedPulling="2026-04-20 19:28:33.733376272 +0000 UTC m=+480.888926069" lastFinishedPulling="2026-04-20 19:28:35.347584085 +0000 UTC m=+482.503133896" observedRunningTime="2026-04-20 19:28:36.214919753 +0000 UTC m=+483.370469574" watchObservedRunningTime="2026-04-20 19:28:36.216518643 +0000 UTC m=+483.372068463" Apr 20 19:28:38.497945 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.497903 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw"] Apr 20 19:28:38.508475 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.508447 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.509775 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.509746 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw"] Apr 20 19:28:38.510856 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.510837 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rbrg7\"" Apr 20 19:28:38.510979 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.510940 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:28:38.511614 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.511598 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:28:38.624061 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.624025 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.624238 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.624072 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.624238 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.624139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvgvl\" (UniqueName: \"kubernetes.io/projected/62595255-b63d-472c-9efc-8aecdb2f1f51-kube-api-access-mvgvl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.725438 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.725397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.725618 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.725451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.725618 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.725488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvgvl\" (UniqueName: \"kubernetes.io/projected/62595255-b63d-472c-9efc-8aecdb2f1f51-kube-api-access-mvgvl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.725885 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:28:38.725859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.725932 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.725869 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.734078 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.734054 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvgvl\" (UniqueName: \"kubernetes.io/projected/62595255-b63d-472c-9efc-8aecdb2f1f51-kube-api-access-mvgvl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" Apr 20 19:28:38.819776 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.819742 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw"
Apr 20 19:28:38.947729 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:38.947673 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw"]
Apr 20 19:28:38.949941 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:38.949910 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62595255_b63d_472c_9efc_8aecdb2f1f51.slice/crio-8fbab6348fdc57417fb036938a7399a28748f8c35a5688d589b19bebfe03872d WatchSource:0}: Error finding container 8fbab6348fdc57417fb036938a7399a28748f8c35a5688d589b19bebfe03872d: Status 404 returned error can't find the container with id 8fbab6348fdc57417fb036938a7399a28748f8c35a5688d589b19bebfe03872d
Apr 20 19:28:39.203265 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:39.203162 2580 generic.go:358] "Generic (PLEG): container finished" podID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerID="716e87ff6457c7aa7f435cf8bf9f33f0f6e69f4fde1e258ca994847cb892ff9b" exitCode=0
Apr 20 19:28:39.203265 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:39.203224 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" event={"ID":"62595255-b63d-472c-9efc-8aecdb2f1f51","Type":"ContainerDied","Data":"716e87ff6457c7aa7f435cf8bf9f33f0f6e69f4fde1e258ca994847cb892ff9b"}
Apr 20 19:28:39.203444 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:39.203287 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" event={"ID":"62595255-b63d-472c-9efc-8aecdb2f1f51","Type":"ContainerStarted","Data":"8fbab6348fdc57417fb036938a7399a28748f8c35a5688d589b19bebfe03872d"}
Apr 20 19:28:40.208986 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:40.208952 2580 generic.go:358] "Generic (PLEG): container finished" podID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerID="5c1664e5f7e61abd452c46a6a8651c6aa674069c255fd7b85f5557e5a703e1f8" exitCode=0
Apr 20 19:28:40.209364 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:40.209043 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" event={"ID":"62595255-b63d-472c-9efc-8aecdb2f1f51","Type":"ContainerDied","Data":"5c1664e5f7e61abd452c46a6a8651c6aa674069c255fd7b85f5557e5a703e1f8"}
Apr 20 19:28:41.214143 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:41.214108 2580 generic.go:358] "Generic (PLEG): container finished" podID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerID="2f1cbfef2c65dcb736a74bed8d564bcb39c54827a4a0336e18e31c298f6319c6" exitCode=0
Apr 20 19:28:41.214601 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:41.214197 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" event={"ID":"62595255-b63d-472c-9efc-8aecdb2f1f51","Type":"ContainerDied","Data":"2f1cbfef2c65dcb736a74bed8d564bcb39c54827a4a0336e18e31c298f6319c6"}
Apr 20 19:28:42.374199 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.374170 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw"
Apr 20 19:28:42.560239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.560210 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-bundle\") pod \"62595255-b63d-472c-9efc-8aecdb2f1f51\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") "
Apr 20 19:28:42.560445 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.560270 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvgvl\" (UniqueName: \"kubernetes.io/projected/62595255-b63d-472c-9efc-8aecdb2f1f51-kube-api-access-mvgvl\") pod \"62595255-b63d-472c-9efc-8aecdb2f1f51\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") "
Apr 20 19:28:42.560445 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.560323 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-util\") pod \"62595255-b63d-472c-9efc-8aecdb2f1f51\" (UID: \"62595255-b63d-472c-9efc-8aecdb2f1f51\") "
Apr 20 19:28:42.561103 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.561074 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-bundle" (OuterVolumeSpecName: "bundle") pod "62595255-b63d-472c-9efc-8aecdb2f1f51" (UID: "62595255-b63d-472c-9efc-8aecdb2f1f51"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:42.562572 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.562552 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62595255-b63d-472c-9efc-8aecdb2f1f51-kube-api-access-mvgvl" (OuterVolumeSpecName: "kube-api-access-mvgvl") pod "62595255-b63d-472c-9efc-8aecdb2f1f51" (UID: "62595255-b63d-472c-9efc-8aecdb2f1f51"). InnerVolumeSpecName "kube-api-access-mvgvl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:28:42.565523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.565499 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-util" (OuterVolumeSpecName: "util") pod "62595255-b63d-472c-9efc-8aecdb2f1f51" (UID: "62595255-b63d-472c-9efc-8aecdb2f1f51"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:42.661314 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.661277 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:42.661314 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.661309 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvgvl\" (UniqueName: \"kubernetes.io/projected/62595255-b63d-472c-9efc-8aecdb2f1f51-kube-api-access-mvgvl\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:42.661521 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:42.661323 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62595255-b63d-472c-9efc-8aecdb2f1f51-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:43.222661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:43.222628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw" event={"ID":"62595255-b63d-472c-9efc-8aecdb2f1f51","Type":"ContainerDied","Data":"8fbab6348fdc57417fb036938a7399a28748f8c35a5688d589b19bebfe03872d"}
Apr 20 19:28:43.222661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:43.222652 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359f7qw"
Apr 20 19:28:43.222661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:43.222662 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fbab6348fdc57417fb036938a7399a28748f8c35a5688d589b19bebfe03872d"
Apr 20 19:28:47.197628 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.197599 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5c6db948fd-794jn"
Apr 20 19:28:47.798817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.798782 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"]
Apr 20 19:28:47.799188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799173 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="util"
Apr 20 19:28:47.799239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799190 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="util"
Apr 20 19:28:47.799239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799200 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="pull"
Apr 20 19:28:47.799239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799205 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="pull"
Apr 20 19:28:47.799239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799220 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="extract"
Apr 20 19:28:47.799239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799226 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="extract"
Apr 20 19:28:47.799424 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.799316 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="62595255-b63d-472c-9efc-8aecdb2f1f51" containerName="extract"
Apr 20 19:28:47.803796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.803779 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.805167 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.805148 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.805303 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.805212 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.805303 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.805286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwtf\" (UniqueName: \"kubernetes.io/projected/6b4108cd-b146-47d9-81a9-fb503caa3bb1-kube-api-access-xvwtf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.806065 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.806039 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:28:47.806744 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.806724 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:28:47.806849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.806753 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rbrg7\""
Apr 20 19:28:47.814843 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.814821 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"]
Apr 20 19:28:47.906596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.906564 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.906596 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.906599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwtf\" (UniqueName: \"kubernetes.io/projected/6b4108cd-b146-47d9-81a9-fb503caa3bb1-kube-api-access-xvwtf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.906834 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.906647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.906989 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.906967 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.907054 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.907008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:47.923484 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:47.923450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwtf\" (UniqueName: \"kubernetes.io/projected/6b4108cd-b146-47d9-81a9-fb503caa3bb1-kube-api-access-xvwtf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:48.113894 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:48.113810 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:48.268760 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:28:48.268693 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4108cd_b146_47d9_81a9_fb503caa3bb1.slice/crio-a24c420f6f4732e7549674c9ee10fd577458535e80e63191f29edb63cd0a49d8 WatchSource:0}: Error finding container a24c420f6f4732e7549674c9ee10fd577458535e80e63191f29edb63cd0a49d8: Status 404 returned error can't find the container with id a24c420f6f4732e7549674c9ee10fd577458535e80e63191f29edb63cd0a49d8
Apr 20 19:28:48.270377 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:48.270132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"]
Apr 20 19:28:49.251740 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:49.251641 2580 generic.go:358] "Generic (PLEG): container finished" podID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerID="3655b88d97ddfee9c3d81c41158ac6119445e186ebc017f105d4347e5a60d93f" exitCode=0
Apr 20 19:28:49.251905 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:49.251745 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf" event={"ID":"6b4108cd-b146-47d9-81a9-fb503caa3bb1","Type":"ContainerDied","Data":"3655b88d97ddfee9c3d81c41158ac6119445e186ebc017f105d4347e5a60d93f"}
Apr 20 19:28:49.251905 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:49.251782 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf" event={"ID":"6b4108cd-b146-47d9-81a9-fb503caa3bb1","Type":"ContainerStarted","Data":"a24c420f6f4732e7549674c9ee10fd577458535e80e63191f29edb63cd0a49d8"}
Apr 20 19:28:51.263759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:51.263726 2580 generic.go:358] "Generic (PLEG): container finished" podID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerID="101ae83a7e7042258f078525c2c96fec2b49d5480e043791bb327f7f0891a6ab" exitCode=0
Apr 20 19:28:51.265913 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:51.265126 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf" event={"ID":"6b4108cd-b146-47d9-81a9-fb503caa3bb1","Type":"ContainerDied","Data":"101ae83a7e7042258f078525c2c96fec2b49d5480e043791bb327f7f0891a6ab"}
Apr 20 19:28:52.271036 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:52.270999 2580 generic.go:358] "Generic (PLEG): container finished" podID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerID="586c46c6904a7ed96364b1bf0a7e7431d98ddf6719bd58cb7de9cfafe75ae6b4" exitCode=0
Apr 20 19:28:52.271439 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:52.271078 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf" event={"ID":"6b4108cd-b146-47d9-81a9-fb503caa3bb1","Type":"ContainerDied","Data":"586c46c6904a7ed96364b1bf0a7e7431d98ddf6719bd58cb7de9cfafe75ae6b4"}
Apr 20 19:28:53.397507 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.397484 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:28:53.453337 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.453301 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-util\") pod \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") "
Apr 20 19:28:53.453467 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.453361 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-bundle\") pod \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") "
Apr 20 19:28:53.453467 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.453396 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvwtf\" (UniqueName: \"kubernetes.io/projected/6b4108cd-b146-47d9-81a9-fb503caa3bb1-kube-api-access-xvwtf\") pod \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\" (UID: \"6b4108cd-b146-47d9-81a9-fb503caa3bb1\") "
Apr 20 19:28:53.454234 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.454209 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-bundle" (OuterVolumeSpecName: "bundle") pod "6b4108cd-b146-47d9-81a9-fb503caa3bb1" (UID: "6b4108cd-b146-47d9-81a9-fb503caa3bb1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:53.455626 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.455591 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4108cd-b146-47d9-81a9-fb503caa3bb1-kube-api-access-xvwtf" (OuterVolumeSpecName: "kube-api-access-xvwtf") pod "6b4108cd-b146-47d9-81a9-fb503caa3bb1" (UID: "6b4108cd-b146-47d9-81a9-fb503caa3bb1"). InnerVolumeSpecName "kube-api-access-xvwtf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:28:53.554478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.554410 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:53.554478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.554436 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xvwtf\" (UniqueName: \"kubernetes.io/projected/6b4108cd-b146-47d9-81a9-fb503caa3bb1-kube-api-access-xvwtf\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:53.572006 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.571943 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-util" (OuterVolumeSpecName: "util") pod "6b4108cd-b146-47d9-81a9-fb503caa3bb1" (UID: "6b4108cd-b146-47d9-81a9-fb503caa3bb1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:53.655076 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:53.655047 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b4108cd-b146-47d9-81a9-fb503caa3bb1-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:28:54.280466 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:54.280430 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf" event={"ID":"6b4108cd-b146-47d9-81a9-fb503caa3bb1","Type":"ContainerDied","Data":"a24c420f6f4732e7549674c9ee10fd577458535e80e63191f29edb63cd0a49d8"}
Apr 20 19:28:54.280466 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:54.280464 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24c420f6f4732e7549674c9ee10fd577458535e80e63191f29edb63cd0a49d8"
Apr 20 19:28:54.280681 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:28:54.280501 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n2vvf"
Apr 20 19:29:19.309278 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.309224 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"]
Apr 20 19:29:19.310469 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.310445 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="extract"
Apr 20 19:29:19.310608 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.310597 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="extract"
Apr 20 19:29:19.310701 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.310692 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="pull"
Apr 20 19:29:19.310776 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.310766 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="pull"
Apr 20 19:29:19.310869 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.310860 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="util"
Apr 20 19:29:19.310942 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.310934 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="util"
Apr 20 19:29:19.311104 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.311093 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b4108cd-b146-47d9-81a9-fb503caa3bb1" containerName="extract"
Apr 20 19:29:19.319756 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.319731 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.323126 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.323093 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"]
Apr 20 19:29:19.326153 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.325976 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-v9j5k\""
Apr 20 19:29:19.326375 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.326176 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 19:29:19.380094 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380067 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380262 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380103 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380262 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380123 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380262 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380142 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380262 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380159 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqmsk\" (UniqueName: \"kubernetes.io/projected/472cf0ba-cb87-451a-8cce-616a43e88e47-kube-api-access-gqmsk\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380436 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380297 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380436 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380324 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/472cf0ba-cb87-451a-8cce-616a43e88e47-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380436 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.380436 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.380397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481543 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481499 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481543 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481564 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481591 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqmsk\" (UniqueName: \"kubernetes.io/projected/472cf0ba-cb87-451a-8cce-616a43e88e47-kube-api-access-gqmsk\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/472cf0ba-cb87-451a-8cce-616a43e88e47-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.481785 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481765 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.482145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.481957 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.482145 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.482036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.482310 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.482209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.482598 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.482575 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/472cf0ba-cb87-451a-8cce-616a43e88e47-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.482681 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.482517 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.484202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.484183 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.484958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.484935 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.489331 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.489311 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/472cf0ba-cb87-451a-8cce-616a43e88e47-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"
Apr 20 19:29:19.489474 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.489456 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqmsk\" (UniqueName: \"kubernetes.io/projected/472cf0ba-cb87-451a-8cce-616a43e88e47-kube-api-access-gqmsk\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557ffqnwz\" (UID: \"472cf0ba-cb87-451a-8cce-616a43e88e47\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" Apr 20 19:29:19.636776 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.636708 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" Apr 20 19:29:19.769235 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:19.769205 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz"] Apr 20 19:29:19.772452 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:29:19.772427 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod472cf0ba_cb87_451a_8cce_616a43e88e47.slice/crio-444a37bbe275f556d17b8e813d0b215866bebe349358ed5383de364c88b24de7 WatchSource:0}: Error finding container 444a37bbe275f556d17b8e813d0b215866bebe349358ed5383de364c88b24de7: Status 404 returned error can't find the container with id 444a37bbe275f556d17b8e813d0b215866bebe349358ed5383de364c88b24de7 Apr 20 19:29:20.380542 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:20.380487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" event={"ID":"472cf0ba-cb87-451a-8cce-616a43e88e47","Type":"ContainerStarted","Data":"444a37bbe275f556d17b8e813d0b215866bebe349358ed5383de364c88b24de7"} Apr 20 19:29:22.092899 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:22.092656 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 19:29:22.092899 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:22.092750 2580 kubelet_resources.go:45] 
"Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 19:29:22.092899 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:22.092787 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 19:29:22.389852 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:22.389760 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" event={"ID":"472cf0ba-cb87-451a-8cce-616a43e88e47","Type":"ContainerStarted","Data":"284e6f89e3d6d59041d1ffabd497d4cb8a3eed61f841298a5f211c8631392edc"} Apr 20 19:29:22.410778 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:22.410722 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" podStartSLOduration=1.093215625 podStartE2EDuration="3.410707458s" podCreationTimestamp="2026-04-20 19:29:19 +0000 UTC" firstStartedPulling="2026-04-20 19:29:19.774867303 +0000 UTC m=+526.930417100" lastFinishedPulling="2026-04-20 19:29:22.092359124 +0000 UTC m=+529.247908933" observedRunningTime="2026-04-20 19:29:22.409689326 +0000 UTC m=+529.565239156" watchObservedRunningTime="2026-04-20 19:29:22.410707458 +0000 UTC m=+529.566257276" Apr 20 19:29:22.637135 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:22.637085 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" Apr 20 19:29:23.641306 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:23.641280 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" Apr 20 19:29:24.396576 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:24.396547 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" Apr 20 19:29:24.397532 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:24.397513 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557ffqnwz" Apr 20 19:29:39.909315 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:39.909279 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nlc75"] Apr 20 19:29:39.914140 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:39.914120 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:39.916441 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:39.916422 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:29:39.917526 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:39.917405 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:29:39.917526 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:39.917431 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-s8bfx\"" Apr 20 19:29:39.920814 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:39.920780 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nlc75"] Apr 20 19:29:40.065921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:40.065886 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtv8\" (UniqueName: 
\"kubernetes.io/projected/ee9a5352-9f0f-4dca-b13a-dd593b551f2c-kube-api-access-vgtv8\") pod \"kuadrant-operator-catalog-nlc75\" (UID: \"ee9a5352-9f0f-4dca-b13a-dd593b551f2c\") " pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:40.166758 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:40.166677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtv8\" (UniqueName: \"kubernetes.io/projected/ee9a5352-9f0f-4dca-b13a-dd593b551f2c-kube-api-access-vgtv8\") pod \"kuadrant-operator-catalog-nlc75\" (UID: \"ee9a5352-9f0f-4dca-b13a-dd593b551f2c\") " pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:40.175112 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:40.175083 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtv8\" (UniqueName: \"kubernetes.io/projected/ee9a5352-9f0f-4dca-b13a-dd593b551f2c-kube-api-access-vgtv8\") pod \"kuadrant-operator-catalog-nlc75\" (UID: \"ee9a5352-9f0f-4dca-b13a-dd593b551f2c\") " pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:40.224806 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:40.224765 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:40.350901 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:40.350877 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nlc75"] Apr 20 19:29:40.352536 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:29:40.352506 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9a5352_9f0f_4dca_b13a_dd593b551f2c.slice/crio-9ba195b1cba76e07be7f39ae46031ef6c1567e218de2f5a17acbeace9afaa578 WatchSource:0}: Error finding container 9ba195b1cba76e07be7f39ae46031ef6c1567e218de2f5a17acbeace9afaa578: Status 404 returned error can't find the container with id 9ba195b1cba76e07be7f39ae46031ef6c1567e218de2f5a17acbeace9afaa578 Apr 20 19:29:40.460758 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:40.460674 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" event={"ID":"ee9a5352-9f0f-4dca-b13a-dd593b551f2c","Type":"ContainerStarted","Data":"9ba195b1cba76e07be7f39ae46031ef6c1567e218de2f5a17acbeace9afaa578"} Apr 20 19:29:42.470214 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:42.470165 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" event={"ID":"ee9a5352-9f0f-4dca-b13a-dd593b551f2c","Type":"ContainerStarted","Data":"ea528b0c9f46741f4c22c5f448e2ee178ef34000c6d0ab7069a00fe50f0d470f"} Apr 20 19:29:42.487181 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:42.487115 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" podStartSLOduration=1.491952258 podStartE2EDuration="3.487095481s" podCreationTimestamp="2026-04-20 19:29:39 +0000 UTC" firstStartedPulling="2026-04-20 19:29:40.35398991 +0000 UTC m=+547.509539721" lastFinishedPulling="2026-04-20 19:29:42.349133144 +0000 UTC 
m=+549.504682944" observedRunningTime="2026-04-20 19:29:42.484491179 +0000 UTC m=+549.640041037" watchObservedRunningTime="2026-04-20 19:29:42.487095481 +0000 UTC m=+549.642645301" Apr 20 19:29:50.224956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:50.224900 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:50.225424 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:50.224973 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:50.247568 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:50.247541 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:50.520469 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:50.520443 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-nlc75" Apr 20 19:29:54.713691 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.713653 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn"] Apr 20 19:29:54.719348 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.719327 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.722107 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.722076 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-j7x8z\"" Apr 20 19:29:54.724664 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.724638 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn"] Apr 20 19:29:54.789147 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.789107 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.789378 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.789164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.789378 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.789316 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmhn\" (UniqueName: \"kubernetes.io/projected/eb1ca523-7b5c-48c1-91cf-66352befe38a-kube-api-access-nxmhn\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.889946 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.889910 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxmhn\" (UniqueName: \"kubernetes.io/projected/eb1ca523-7b5c-48c1-91cf-66352befe38a-kube-api-access-nxmhn\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.890122 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.889959 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.890122 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.890003 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.890400 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.890382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.890462 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.890420 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:54.897803 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:54.897782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmhn\" (UniqueName: \"kubernetes.io/projected/eb1ca523-7b5c-48c1-91cf-66352befe38a-kube-api-access-nxmhn\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:55.030431 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.030403 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" Apr 20 19:29:55.163942 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.163915 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn"] Apr 20 19:29:55.165675 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:29:55.165646 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb1ca523_7b5c_48c1_91cf_66352befe38a.slice/crio-4d409a7a697b56873e834e7c2d67870c710e9a42ddfe63d11102ec0dd798f55c WatchSource:0}: Error finding container 4d409a7a697b56873e834e7c2d67870c710e9a42ddfe63d11102ec0dd798f55c: Status 404 returned error can't find the container with id 4d409a7a697b56873e834e7c2d67870c710e9a42ddfe63d11102ec0dd798f55c Apr 20 19:29:55.312497 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.312404 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js"] Apr 20 19:29:55.315001 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.314981 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.324129 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.324101 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js"] Apr 20 19:29:55.394370 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.394334 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.394370 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.394369 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.394598 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.394441 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w4pw\" (UniqueName: \"kubernetes.io/projected/85d30de4-f22c-4375-acb1-005a6556895f-kube-api-access-5w4pw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.495441 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.495393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5w4pw\" (UniqueName: \"kubernetes.io/projected/85d30de4-f22c-4375-acb1-005a6556895f-kube-api-access-5w4pw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.495623 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.495531 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.495623 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.495569 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.495912 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.495892 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.495953 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.495932 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-util\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.503213 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.503189 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w4pw\" (UniqueName: \"kubernetes.io/projected/85d30de4-f22c-4375-acb1-005a6556895f-kube-api-access-5w4pw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.519151 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.519120 2580 generic.go:358] "Generic (PLEG): container finished" podID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerID="48bf97c191f03c28af0c984634eb34044b5c519e047a7f53d169ee354c7c9a1e" exitCode=0 Apr 20 19:29:55.519314 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.519203 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" event={"ID":"eb1ca523-7b5c-48c1-91cf-66352befe38a","Type":"ContainerDied","Data":"48bf97c191f03c28af0c984634eb34044b5c519e047a7f53d169ee354c7c9a1e"} Apr 20 19:29:55.519314 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.519243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" event={"ID":"eb1ca523-7b5c-48c1-91cf-66352befe38a","Type":"ContainerStarted","Data":"4d409a7a697b56873e834e7c2d67870c710e9a42ddfe63d11102ec0dd798f55c"} Apr 20 19:29:55.633594 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.633500 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:29:55.718698 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.718665 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"] Apr 20 19:29:55.722197 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.722175 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:29:55.728233 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.728211 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"] Apr 20 19:29:55.776280 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.775090 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js"] Apr 20 19:29:55.798489 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.798460 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:29:55.798622 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.798523 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9ml7\" (UniqueName: \"kubernetes.io/projected/40afe44c-fd24-405c-9d93-1ddd9db818d2-kube-api-access-q9ml7\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:29:55.798696 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.798673 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:29:55.899739 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.899641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9ml7\" (UniqueName: \"kubernetes.io/projected/40afe44c-fd24-405c-9d93-1ddd9db818d2-kube-api-access-q9ml7\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:29:55.899885 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.899772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:29:55.899885 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.899825 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"
Apr 20 19:29:55.900196 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.900177 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"
Apr 20 19:29:55.900231 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.900211 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"
Apr 20 19:29:55.907608 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:55.907580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9ml7\" (UniqueName: \"kubernetes.io/projected/40afe44c-fd24-405c-9d93-1ddd9db818d2-kube-api-access-q9ml7\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"
Apr 20 19:29:56.034726 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.034699 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"
Apr 20 19:29:56.116958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.116914 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"]
Apr 20 19:29:56.121193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.121167 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.127106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.127077 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"]
Apr 20 19:29:56.168501 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.168473 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"]
Apr 20 19:29:56.203365 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.203326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwl7v\" (UniqueName: \"kubernetes.io/projected/0f2dbce2-0997-48f9-b99e-7f49e643677c-kube-api-access-hwl7v\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.203550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.203450 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.203621 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.203584 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.209730 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:29:56.209703 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40afe44c_fd24_405c_9d93_1ddd9db818d2.slice/crio-75d0449ad434882addd78323a1c28318b76e00e373af944160708142fda1e8a0 WatchSource:0}: Error finding container 75d0449ad434882addd78323a1c28318b76e00e373af944160708142fda1e8a0: Status 404 returned error can't find the container with id 75d0449ad434882addd78323a1c28318b76e00e373af944160708142fda1e8a0
Apr 20 19:29:56.304609 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.304576 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.304930 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.304626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwl7v\" (UniqueName: \"kubernetes.io/projected/0f2dbce2-0997-48f9-b99e-7f49e643677c-kube-api-access-hwl7v\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.304930 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.304719 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.305073 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.304981 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.305073 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.304987 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.316043 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.316011 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwl7v\" (UniqueName: \"kubernetes.io/projected/0f2dbce2-0997-48f9-b99e-7f49e643677c-kube-api-access-hwl7v\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.435118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.435040 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"
Apr 20 19:29:56.524761 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.524728 2580 generic.go:358] "Generic (PLEG): container finished" podID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerID="9b211c2d46633111e93f47a8f19068ea05fbe9351c7df740ddaf4706f55e2c30" exitCode=0
Apr 20 19:29:56.524910 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.524814 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" event={"ID":"40afe44c-fd24-405c-9d93-1ddd9db818d2","Type":"ContainerDied","Data":"9b211c2d46633111e93f47a8f19068ea05fbe9351c7df740ddaf4706f55e2c30"}
Apr 20 19:29:56.524910 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.524854 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" event={"ID":"40afe44c-fd24-405c-9d93-1ddd9db818d2","Type":"ContainerStarted","Data":"75d0449ad434882addd78323a1c28318b76e00e373af944160708142fda1e8a0"}
Apr 20 19:29:56.526326 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.526303 2580 generic.go:358] "Generic (PLEG): container finished" podID="85d30de4-f22c-4375-acb1-005a6556895f" containerID="ccdc6f88c0a706a4075e9a2f3b1019e296ec3e6bf1f59098e35f69eeb4a5fb74" exitCode=0
Apr 20 19:29:56.526523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.526471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" event={"ID":"85d30de4-f22c-4375-acb1-005a6556895f","Type":"ContainerDied","Data":"ccdc6f88c0a706a4075e9a2f3b1019e296ec3e6bf1f59098e35f69eeb4a5fb74"}
Apr 20 19:29:56.526523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.526504 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" event={"ID":"85d30de4-f22c-4375-acb1-005a6556895f","Type":"ContainerStarted","Data":"b15f3b0651812b80844ecdee620ea2c1d1cb3c1e5ddf67428c74fc555e4e184f"}
Apr 20 19:29:56.528704 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.528576 2580 generic.go:358] "Generic (PLEG): container finished" podID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerID="1d591fb3b216b3038982f2ea854ba3f4f346219c03cda0643df216b7af604680" exitCode=0
Apr 20 19:29:56.528704 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.528661 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" event={"ID":"eb1ca523-7b5c-48c1-91cf-66352befe38a","Type":"ContainerDied","Data":"1d591fb3b216b3038982f2ea854ba3f4f346219c03cda0643df216b7af604680"}
Apr 20 19:29:56.565152 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:56.565125 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj"]
Apr 20 19:29:56.566677 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:29:56.566647 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2dbce2_0997_48f9_b99e_7f49e643677c.slice/crio-fcf6c44c94923861b76ca5f767327c13eef62598a714d757f14b0da7ed2d49b8 WatchSource:0}: Error finding container fcf6c44c94923861b76ca5f767327c13eef62598a714d757f14b0da7ed2d49b8: Status 404 returned error can't find the container with id fcf6c44c94923861b76ca5f767327c13eef62598a714d757f14b0da7ed2d49b8
Apr 20 19:29:57.431244 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.431211 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d8566fcc-8d2rq"]
Apr 20 19:29:57.434845 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.434818 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.449336 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.449308 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8566fcc-8d2rq"]
Apr 20 19:29:57.515532 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515495 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-serving-cert\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.515532 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-oauth-config\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.515743 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfsqk\" (UniqueName: \"kubernetes.io/projected/549fb4c6-a797-44cc-ad04-c044daa57e7f-kube-api-access-qfsqk\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.515743 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515636 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-config\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.515743 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515660 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-service-ca\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.515743 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-oauth-serving-cert\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.515869 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.515754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-trusted-ca-bundle\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.534816 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.534740 2580 generic.go:358] "Generic (PLEG): container finished" podID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerID="f1a35d5358f7ed5a5b7711edb335c2e869fdb85ad5588d4807284d4568408157" exitCode=0
Apr 20 19:29:57.534960 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.534827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" event={"ID":"0f2dbce2-0997-48f9-b99e-7f49e643677c","Type":"ContainerDied","Data":"f1a35d5358f7ed5a5b7711edb335c2e869fdb85ad5588d4807284d4568408157"}
Apr 20 19:29:57.534960 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.534859 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" event={"ID":"0f2dbce2-0997-48f9-b99e-7f49e643677c","Type":"ContainerStarted","Data":"fcf6c44c94923861b76ca5f767327c13eef62598a714d757f14b0da7ed2d49b8"}
Apr 20 19:29:57.536875 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.536846 2580 generic.go:358] "Generic (PLEG): container finished" podID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerID="4ff46bcb9840ebc8301ae5979216977840dfc2ec0ac9d11841335272c954f587" exitCode=0
Apr 20 19:29:57.536988 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.536920 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" event={"ID":"40afe44c-fd24-405c-9d93-1ddd9db818d2","Type":"ContainerDied","Data":"4ff46bcb9840ebc8301ae5979216977840dfc2ec0ac9d11841335272c954f587"}
Apr 20 19:29:57.538768 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.538746 2580 generic.go:358] "Generic (PLEG): container finished" podID="85d30de4-f22c-4375-acb1-005a6556895f" containerID="859d117daf128fe4009c0c93e65baeab010dd26803c10492ff93e84a7305359b" exitCode=0
Apr 20 19:29:57.538870 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.538840 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" event={"ID":"85d30de4-f22c-4375-acb1-005a6556895f","Type":"ContainerDied","Data":"859d117daf128fe4009c0c93e65baeab010dd26803c10492ff93e84a7305359b"}
Apr 20 19:29:57.541066 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.541045 2580 generic.go:358] "Generic (PLEG): container finished" podID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerID="bbc27435da8ee1d05aedf775179b9c7ad49c9ca633a79f38a56642ec74878017" exitCode=0
Apr 20 19:29:57.541174 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.541077 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" event={"ID":"eb1ca523-7b5c-48c1-91cf-66352befe38a","Type":"ContainerDied","Data":"bbc27435da8ee1d05aedf775179b9c7ad49c9ca633a79f38a56642ec74878017"}
Apr 20 19:29:57.617224 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.617189 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-serving-cert\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.617224 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.617227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-oauth-config\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.617485 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.617266 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfsqk\" (UniqueName: \"kubernetes.io/projected/549fb4c6-a797-44cc-ad04-c044daa57e7f-kube-api-access-qfsqk\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.617605 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.617582 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-config\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.617692 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.617664 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-service-ca\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.619203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.617785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-oauth-serving-cert\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.619203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.618188 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-trusted-ca-bundle\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.619203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.618285 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-config\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.619203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.618560 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-service-ca\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.619203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.619161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-oauth-serving-cert\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.619203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.619175 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/549fb4c6-a797-44cc-ad04-c044daa57e7f-trusted-ca-bundle\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.620366 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.620343 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-serving-cert\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.621165 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.621141 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/549fb4c6-a797-44cc-ad04-c044daa57e7f-console-oauth-config\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.624777 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.624728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfsqk\" (UniqueName: \"kubernetes.io/projected/549fb4c6-a797-44cc-ad04-c044daa57e7f-kube-api-access-qfsqk\") pod \"console-d8566fcc-8d2rq\" (UID: \"549fb4c6-a797-44cc-ad04-c044daa57e7f\") " pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.784921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.784825 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d8566fcc-8d2rq"
Apr 20 19:29:57.912931 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:57.912906 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d8566fcc-8d2rq"]
Apr 20 19:29:57.914161 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:29:57.914136 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549fb4c6_a797_44cc_ad04_c044daa57e7f.slice/crio-ba3fd8e3882c1f57c922f1e34819469b871eb02f77e5bb0a59dcfc8b06f42e9a WatchSource:0}: Error finding container ba3fd8e3882c1f57c922f1e34819469b871eb02f77e5bb0a59dcfc8b06f42e9a: Status 404 returned error can't find the container with id ba3fd8e3882c1f57c922f1e34819469b871eb02f77e5bb0a59dcfc8b06f42e9a
Apr 20 19:29:58.547322 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.547207 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8566fcc-8d2rq" event={"ID":"549fb4c6-a797-44cc-ad04-c044daa57e7f","Type":"ContainerStarted","Data":"d94d73ca6a19700599f169131f3abca60704c6e1f40c4a3d18195b183863457a"}
Apr 20 19:29:58.547322 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.547243 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d8566fcc-8d2rq" event={"ID":"549fb4c6-a797-44cc-ad04-c044daa57e7f","Type":"ContainerStarted","Data":"ba3fd8e3882c1f57c922f1e34819469b871eb02f77e5bb0a59dcfc8b06f42e9a"}
Apr 20 19:29:58.549216 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.549193 2580 generic.go:358] "Generic (PLEG): container finished" podID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerID="fe8decf278f839ea9aceaf82c29e642c6894b228a17abf0a6181ff6b442499cf" exitCode=0
Apr 20 19:29:58.549364 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.549268 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" event={"ID":"40afe44c-fd24-405c-9d93-1ddd9db818d2","Type":"ContainerDied","Data":"fe8decf278f839ea9aceaf82c29e642c6894b228a17abf0a6181ff6b442499cf"}
Apr 20 19:29:58.551124 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.551103 2580 generic.go:358] "Generic (PLEG): container finished" podID="85d30de4-f22c-4375-acb1-005a6556895f" containerID="f26e4795d7dade1ff3617803856a0583b43be0ac39d5add3e178e6c484a98cfc" exitCode=0
Apr 20 19:29:58.551239 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.551183 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" event={"ID":"85d30de4-f22c-4375-acb1-005a6556895f","Type":"ContainerDied","Data":"f26e4795d7dade1ff3617803856a0583b43be0ac39d5add3e178e6c484a98cfc"}
Apr 20 19:29:58.552772 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.552755 2580 generic.go:358] "Generic (PLEG): container finished" podID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerID="ed508c116b3b82adbe0065b8ce10905dbebf830ff715320cad7d224f7f0b1cdd" exitCode=0
Apr 20 19:29:58.552863 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.552833 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" event={"ID":"0f2dbce2-0997-48f9-b99e-7f49e643677c","Type":"ContainerDied","Data":"ed508c116b3b82adbe0065b8ce10905dbebf830ff715320cad7d224f7f0b1cdd"}
Apr 20 19:29:58.564951 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.564902 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d8566fcc-8d2rq" podStartSLOduration=1.564885643 podStartE2EDuration="1.564885643s" podCreationTimestamp="2026-04-20 19:29:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:29:58.563336417 +0000 UTC m=+565.718886236" watchObservedRunningTime="2026-04-20 19:29:58.564885643 +0000 UTC m=+565.720435464"
Apr 20 19:29:58.682380 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.682357 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn"
Apr 20 19:29:58.729701 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.729668 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-util\") pod \"eb1ca523-7b5c-48c1-91cf-66352befe38a\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") "
Apr 20 19:29:58.729701 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.729705 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxmhn\" (UniqueName: \"kubernetes.io/projected/eb1ca523-7b5c-48c1-91cf-66352befe38a-kube-api-access-nxmhn\") pod \"eb1ca523-7b5c-48c1-91cf-66352befe38a\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") "
Apr 20 19:29:58.729923 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.729737 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-bundle\") pod \"eb1ca523-7b5c-48c1-91cf-66352befe38a\" (UID: \"eb1ca523-7b5c-48c1-91cf-66352befe38a\") "
Apr 20 19:29:58.730319 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.730268 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-bundle" (OuterVolumeSpecName: "bundle") pod "eb1ca523-7b5c-48c1-91cf-66352befe38a" (UID: "eb1ca523-7b5c-48c1-91cf-66352befe38a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:58.732200 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.732180 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb1ca523-7b5c-48c1-91cf-66352befe38a-kube-api-access-nxmhn" (OuterVolumeSpecName: "kube-api-access-nxmhn") pod "eb1ca523-7b5c-48c1-91cf-66352befe38a" (UID: "eb1ca523-7b5c-48c1-91cf-66352befe38a"). InnerVolumeSpecName "kube-api-access-nxmhn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:58.734815 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.734780 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-util" (OuterVolumeSpecName: "util") pod "eb1ca523-7b5c-48c1-91cf-66352befe38a" (UID: "eb1ca523-7b5c-48c1-91cf-66352befe38a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:58.830687 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.830593 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:58.830687 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.830623 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxmhn\" (UniqueName: \"kubernetes.io/projected/eb1ca523-7b5c-48c1-91cf-66352befe38a-kube-api-access-nxmhn\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:58.830687 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:58.830634 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb1ca523-7b5c-48c1-91cf-66352befe38a-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:59.558325 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.558287 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn" event={"ID":"eb1ca523-7b5c-48c1-91cf-66352befe38a","Type":"ContainerDied","Data":"4d409a7a697b56873e834e7c2d67870c710e9a42ddfe63d11102ec0dd798f55c"}
Apr 20 19:29:59.558325 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.558314 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn"
Apr 20 19:29:59.558325 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.558330 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d409a7a697b56873e834e7c2d67870c710e9a42ddfe63d11102ec0dd798f55c"
Apr 20 19:29:59.560202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.560179 2580 generic.go:358] "Generic (PLEG): container finished" podID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerID="b34ad9df20108f53bc7e5d262b5079efbd795d7dd03b6e0f07981a656d243e8b" exitCode=0
Apr 20 19:29:59.560336 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.560233 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" event={"ID":"0f2dbce2-0997-48f9-b99e-7f49e643677c","Type":"ContainerDied","Data":"b34ad9df20108f53bc7e5d262b5079efbd795d7dd03b6e0f07981a656d243e8b"}
Apr 20 19:29:59.719163 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.719139 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2"
Apr 20 19:29:59.723349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.723323 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js"
Apr 20 19:29:59.841690 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.841612 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-bundle\") pod \"40afe44c-fd24-405c-9d93-1ddd9db818d2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") "
Apr 20 19:29:59.841842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.841709 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-util\") pod \"85d30de4-f22c-4375-acb1-005a6556895f\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") "
Apr 20 19:29:59.841842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.841736 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w4pw\" (UniqueName: \"kubernetes.io/projected/85d30de4-f22c-4375-acb1-005a6556895f-kube-api-access-5w4pw\") pod \"85d30de4-f22c-4375-acb1-005a6556895f\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") "
Apr 20 19:29:59.841842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.841765 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-util\") pod \"40afe44c-fd24-405c-9d93-1ddd9db818d2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") "
Apr 20 19:29:59.841842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.841796 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9ml7\" (UniqueName: \"kubernetes.io/projected/40afe44c-fd24-405c-9d93-1ddd9db818d2-kube-api-access-q9ml7\") pod \"40afe44c-fd24-405c-9d93-1ddd9db818d2\" (UID: \"40afe44c-fd24-405c-9d93-1ddd9db818d2\") "
Apr 20 19:29:59.842060 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.841863 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-bundle\") pod \"85d30de4-f22c-4375-acb1-005a6556895f\" (UID: \"85d30de4-f22c-4375-acb1-005a6556895f\") "
Apr 20 19:29:59.842447 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.842416 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-bundle" (OuterVolumeSpecName: "bundle") pod "85d30de4-f22c-4375-acb1-005a6556895f" (UID: "85d30de4-f22c-4375-acb1-005a6556895f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:59.842569 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.842416 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-bundle" (OuterVolumeSpecName: "bundle") pod "40afe44c-fd24-405c-9d93-1ddd9db818d2" (UID: "40afe44c-fd24-405c-9d93-1ddd9db818d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:59.844178 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.844151 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40afe44c-fd24-405c-9d93-1ddd9db818d2-kube-api-access-q9ml7" (OuterVolumeSpecName: "kube-api-access-q9ml7") pod "40afe44c-fd24-405c-9d93-1ddd9db818d2" (UID: "40afe44c-fd24-405c-9d93-1ddd9db818d2"). InnerVolumeSpecName "kube-api-access-q9ml7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:59.844415 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.844398 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d30de4-f22c-4375-acb1-005a6556895f-kube-api-access-5w4pw" (OuterVolumeSpecName: "kube-api-access-5w4pw") pod "85d30de4-f22c-4375-acb1-005a6556895f" (UID: "85d30de4-f22c-4375-acb1-005a6556895f"). InnerVolumeSpecName "kube-api-access-5w4pw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:59.848093 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.848072 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-util" (OuterVolumeSpecName: "util") pod "40afe44c-fd24-405c-9d93-1ddd9db818d2" (UID: "40afe44c-fd24-405c-9d93-1ddd9db818d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:59.849626 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.849604 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-util" (OuterVolumeSpecName: "util") pod "85d30de4-f22c-4375-acb1-005a6556895f" (UID: "85d30de4-f22c-4375-acb1-005a6556895f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:59.943385 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.943348 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:59.943385 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.943378 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9ml7\" (UniqueName: \"kubernetes.io/projected/40afe44c-fd24-405c-9d93-1ddd9db818d2-kube-api-access-q9ml7\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:59.943385 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.943389 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:59.943594 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.943400 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40afe44c-fd24-405c-9d93-1ddd9db818d2-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:59.943594 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.943408 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85d30de4-f22c-4375-acb1-005a6556895f-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:29:59.943594 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:29:59.943416 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5w4pw\" (UniqueName: \"kubernetes.io/projected/85d30de4-f22c-4375-acb1-005a6556895f-kube-api-access-5w4pw\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:00.573184 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.573149 2580 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" event={"ID":"40afe44c-fd24-405c-9d93-1ddd9db818d2","Type":"ContainerDied","Data":"75d0449ad434882addd78323a1c28318b76e00e373af944160708142fda1e8a0"} Apr 20 19:30:00.573184 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.573186 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75d0449ad434882addd78323a1c28318b76e00e373af944160708142fda1e8a0" Apr 20 19:30:00.573626 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.573185 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2" Apr 20 19:30:00.575023 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.575000 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" Apr 20 19:30:00.575023 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.575008 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js" event={"ID":"85d30de4-f22c-4375-acb1-005a6556895f","Type":"ContainerDied","Data":"b15f3b0651812b80844ecdee620ea2c1d1cb3c1e5ddf67428c74fc555e4e184f"} Apr 20 19:30:00.575023 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.575030 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b15f3b0651812b80844ecdee620ea2c1d1cb3c1e5ddf67428c74fc555e4e184f" Apr 20 19:30:00.700389 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.700365 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" Apr 20 19:30:00.750468 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.750433 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwl7v\" (UniqueName: \"kubernetes.io/projected/0f2dbce2-0997-48f9-b99e-7f49e643677c-kube-api-access-hwl7v\") pod \"0f2dbce2-0997-48f9-b99e-7f49e643677c\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " Apr 20 19:30:00.750618 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.750550 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-util\") pod \"0f2dbce2-0997-48f9-b99e-7f49e643677c\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " Apr 20 19:30:00.750682 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.750637 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-bundle\") pod \"0f2dbce2-0997-48f9-b99e-7f49e643677c\" (UID: \"0f2dbce2-0997-48f9-b99e-7f49e643677c\") " Apr 20 19:30:00.751112 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.751087 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-bundle" (OuterVolumeSpecName: "bundle") pod "0f2dbce2-0997-48f9-b99e-7f49e643677c" (UID: "0f2dbce2-0997-48f9-b99e-7f49e643677c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:30:00.752601 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.752572 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2dbce2-0997-48f9-b99e-7f49e643677c-kube-api-access-hwl7v" (OuterVolumeSpecName: "kube-api-access-hwl7v") pod "0f2dbce2-0997-48f9-b99e-7f49e643677c" (UID: "0f2dbce2-0997-48f9-b99e-7f49e643677c"). InnerVolumeSpecName "kube-api-access-hwl7v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:30:00.756774 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.756747 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-util" (OuterVolumeSpecName: "util") pod "0f2dbce2-0997-48f9-b99e-7f49e643677c" (UID: "0f2dbce2-0997-48f9-b99e-7f49e643677c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:30:00.852110 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.852018 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwl7v\" (UniqueName: \"kubernetes.io/projected/0f2dbce2-0997-48f9-b99e-7f49e643677c-kube-api-access-hwl7v\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:00.852110 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.852053 2580 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-util\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:00.852110 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:00.852066 2580 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0f2dbce2-0997-48f9-b99e-7f49e643677c-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:01.580118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:01.580090 2580 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" Apr 20 19:30:01.580475 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:01.580101 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj" event={"ID":"0f2dbce2-0997-48f9-b99e-7f49e643677c","Type":"ContainerDied","Data":"fcf6c44c94923861b76ca5f767327c13eef62598a714d757f14b0da7ed2d49b8"} Apr 20 19:30:01.580475 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:01.580146 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf6c44c94923861b76ca5f767327c13eef62598a714d757f14b0da7ed2d49b8" Apr 20 19:30:07.785433 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:07.785389 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d8566fcc-8d2rq" Apr 20 19:30:07.785895 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:07.785465 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d8566fcc-8d2rq" Apr 20 19:30:07.790199 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:07.790176 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d8566fcc-8d2rq" Apr 20 19:30:08.612763 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:08.612724 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d8566fcc-8d2rq" Apr 20 19:30:08.667456 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:08.667421 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6689b8cd74-x72mb"] Apr 20 19:30:19.700137 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700105 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm"] Apr 20 19:30:19.700580 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:30:19.700539 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="util" Apr 20 19:30:19.700580 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700555 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="util" Apr 20 19:30:19.700580 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700566 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="pull" Apr 20 19:30:19.700580 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700572 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="pull" Apr 20 19:30:19.700580 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700579 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700584 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700593 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="util" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700598 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="util" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700605 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="pull" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700610 2580 
state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="pull" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700618 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700626 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700645 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="util" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700650 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="util" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700656 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="pull" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700661 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="pull" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700667 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="pull" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700672 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="pull" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700692 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="util" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700697 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="util" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700707 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700718 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700725 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="extract" Apr 20 19:30:19.700759 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700730 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="extract" Apr 20 19:30:19.701316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700793 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="40afe44c-fd24-405c-9d93-1ddd9db818d2" containerName="extract" Apr 20 19:30:19.701316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700804 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="85d30de4-f22c-4375-acb1-005a6556895f" containerName="extract" Apr 20 19:30:19.701316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700811 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb1ca523-7b5c-48c1-91cf-66352befe38a" containerName="extract" Apr 20 19:30:19.701316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.700820 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f2dbce2-0997-48f9-b99e-7f49e643677c" containerName="extract" Apr 20 
19:30:19.704929 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.704908 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:19.707551 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.707530 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-sxkjt\"" Apr 20 19:30:19.715357 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.715332 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm"] Apr 20 19:30:19.825049 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.825025 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e2afac6-60bc-4770-8da4-f6a9520955f9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:19.825191 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.825097 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzc8\" (UniqueName: \"kubernetes.io/projected/9e2afac6-60bc-4770-8da4-f6a9520955f9-kube-api-access-nkzc8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:19.926203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.926168 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkzc8\" (UniqueName: \"kubernetes.io/projected/9e2afac6-60bc-4770-8da4-f6a9520955f9-kube-api-access-nkzc8\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:19.926448 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.926303 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e2afac6-60bc-4770-8da4-f6a9520955f9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:19.926702 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.926683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e2afac6-60bc-4770-8da4-f6a9520955f9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:19.937669 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:19.937649 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzc8\" (UniqueName: \"kubernetes.io/projected/9e2afac6-60bc-4770-8da4-f6a9520955f9-kube-api-access-nkzc8\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:20.016529 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:20.016506 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:20.148520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:20.148495 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm"] Apr 20 19:30:20.150719 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:30:20.150691 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e2afac6_60bc_4770_8da4_f6a9520955f9.slice/crio-810093291bcbc18395364b1ebf089f353de61b5c7fb1a30c1ec652e6ada02ae9 WatchSource:0}: Error finding container 810093291bcbc18395364b1ebf089f353de61b5c7fb1a30c1ec652e6ada02ae9: Status 404 returned error can't find the container with id 810093291bcbc18395364b1ebf089f353de61b5c7fb1a30c1ec652e6ada02ae9 Apr 20 19:30:20.656727 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:20.656675 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" event={"ID":"9e2afac6-60bc-4770-8da4-f6a9520955f9","Type":"ContainerStarted","Data":"810093291bcbc18395364b1ebf089f353de61b5c7fb1a30c1ec652e6ada02ae9"} Apr 20 19:30:25.678926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:25.678883 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" event={"ID":"9e2afac6-60bc-4770-8da4-f6a9520955f9","Type":"ContainerStarted","Data":"cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60"} Apr 20 19:30:25.679460 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:25.678984 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:25.704190 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:25.704131 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" podStartSLOduration=1.933283269 podStartE2EDuration="6.704111976s" podCreationTimestamp="2026-04-20 19:30:19 +0000 UTC" firstStartedPulling="2026-04-20 19:30:20.153068266 +0000 UTC m=+587.308618063" lastFinishedPulling="2026-04-20 19:30:24.923896972 +0000 UTC m=+592.079446770" observedRunningTime="2026-04-20 19:30:25.703657323 +0000 UTC m=+592.859207142" watchObservedRunningTime="2026-04-20 19:30:25.704111976 +0000 UTC m=+592.859661797" Apr 20 19:30:33.505010 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.504982 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:30:33.505507 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.504982 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:30:33.511577 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.511555 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:30:33.511709 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.511679 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:30:33.691689 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.691568 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6689b8cd74-x72mb" podUID="d6aa5975-60c4-418b-baac-3401036ab231" containerName="console" containerID="cri-o://62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d" gracePeriod=15 Apr 20 19:30:33.945854 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:30:33.945827 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6689b8cd74-x72mb_d6aa5975-60c4-418b-baac-3401036ab231/console/0.log" Apr 20 19:30:33.945977 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.945889 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:30:33.957724 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957701 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-console-config\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.957819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957733 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6fql\" (UniqueName: \"kubernetes.io/projected/d6aa5975-60c4-418b-baac-3401036ab231-kube-api-access-c6fql\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.957819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957765 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-serving-cert\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.957819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957790 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-oauth-serving-cert\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.957819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957806 2580 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-trusted-ca-bundle\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.958009 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957864 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-service-ca\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.958009 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.957934 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-oauth-config\") pod \"d6aa5975-60c4-418b-baac-3401036ab231\" (UID: \"d6aa5975-60c4-418b-baac-3401036ab231\") " Apr 20 19:30:33.958227 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.958185 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:30:33.958356 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.958219 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-console-config" (OuterVolumeSpecName: "console-config") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:30:33.958356 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.958321 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:30:33.958454 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.958346 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:30:33.959968 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.959946 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:30:33.960109 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.960091 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:30:33.960200 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:33.960182 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6aa5975-60c4-418b-baac-3401036ab231-kube-api-access-c6fql" (OuterVolumeSpecName: "kube-api-access-c6fql") pod "d6aa5975-60c4-418b-baac-3401036ab231" (UID: "d6aa5975-60c4-418b-baac-3401036ab231"). InnerVolumeSpecName "kube-api-access-c6fql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:30:34.059056 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059017 2580 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-oauth-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.059056 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059049 2580 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-console-config\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.059056 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059063 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c6fql\" (UniqueName: \"kubernetes.io/projected/d6aa5975-60c4-418b-baac-3401036ab231-kube-api-access-c6fql\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.059354 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059077 2580 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6aa5975-60c4-418b-baac-3401036ab231-console-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.059354 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059089 2580 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-oauth-serving-cert\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.059354 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059101 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-trusted-ca-bundle\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.059354 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.059112 2580 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6aa5975-60c4-418b-baac-3401036ab231-service-ca\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:30:34.715492 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.715463 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6689b8cd74-x72mb_d6aa5975-60c4-418b-baac-3401036ab231/console/0.log" Apr 20 19:30:34.715996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.715504 2580 generic.go:358] "Generic (PLEG): container finished" podID="d6aa5975-60c4-418b-baac-3401036ab231" containerID="62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d" exitCode=2 Apr 20 19:30:34.715996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.715532 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6689b8cd74-x72mb" event={"ID":"d6aa5975-60c4-418b-baac-3401036ab231","Type":"ContainerDied","Data":"62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d"} Apr 20 19:30:34.715996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.715554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6689b8cd74-x72mb" event={"ID":"d6aa5975-60c4-418b-baac-3401036ab231","Type":"ContainerDied","Data":"abd7442d8acf1cc684abbe013f18785951660f2841e9ce5d97218a9a11c09a34"} Apr 20 19:30:34.715996 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:30:34.715568 2580 scope.go:117] "RemoveContainer" containerID="62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d" Apr 20 19:30:34.715996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.715587 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6689b8cd74-x72mb" Apr 20 19:30:34.725181 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.725163 2580 scope.go:117] "RemoveContainer" containerID="62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d" Apr 20 19:30:34.725532 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:30:34.725509 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d\": container with ID starting with 62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d not found: ID does not exist" containerID="62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d" Apr 20 19:30:34.725613 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.725541 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d"} err="failed to get container status \"62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d\": rpc error: code = NotFound desc = could not find container \"62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d\": container with ID starting with 62a8c83326066b8f2dd4d570b69838d1e420e98ccb0f9313d4d4c1cf4f7f8d4d not found: ID does not exist" Apr 20 19:30:34.739735 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.739706 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6689b8cd74-x72mb"] Apr 20 19:30:34.742015 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:34.741996 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-6689b8cd74-x72mb"] Apr 20 19:30:35.553164 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:35.553131 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6aa5975-60c4-418b-baac-3401036ab231" path="/var/lib/kubelet/pods/d6aa5975-60c4-418b-baac-3401036ab231/volumes" Apr 20 19:30:36.684135 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:36.684105 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:37.768034 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.768003 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"] Apr 20 19:30:37.768442 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.768427 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6aa5975-60c4-418b-baac-3401036ab231" containerName="console" Apr 20 19:30:37.768506 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.768443 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6aa5975-60c4-418b-baac-3401036ab231" containerName="console" Apr 20 19:30:37.768548 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.768521 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d6aa5975-60c4-418b-baac-3401036ab231" containerName="console" Apr 20 19:30:37.773134 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.773116 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:37.788956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.788924 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"] Apr 20 19:30:37.792392 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.792364 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/407506dc-510d-4722-8c05-01fb02fbba8c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:37.792533 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.792411 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd9v6\" (UniqueName: \"kubernetes.io/projected/407506dc-510d-4722-8c05-01fb02fbba8c-kube-api-access-cd9v6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:37.892859 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.892827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/407506dc-510d-4722-8c05-01fb02fbba8c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:37.893056 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.892873 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd9v6\" 
(UniqueName: \"kubernetes.io/projected/407506dc-510d-4722-8c05-01fb02fbba8c-kube-api-access-cd9v6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:37.893297 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.893240 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/407506dc-510d-4722-8c05-01fb02fbba8c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:37.906303 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:37.906276 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd9v6\" (UniqueName: \"kubernetes.io/projected/407506dc-510d-4722-8c05-01fb02fbba8c-kube-api-access-cd9v6\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:38.083516 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.083434 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" Apr 20 19:30:38.211305 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.211279 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"] Apr 20 19:30:38.214054 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:30:38.214023 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod407506dc_510d_4722_8c05_01fb02fbba8c.slice/crio-d9407b658617cd0a1e0ba00c6a6c81f65a1f2d9cccc0ee71a49e0edf5ec64912 WatchSource:0}: Error finding container d9407b658617cd0a1e0ba00c6a6c81f65a1f2d9cccc0ee71a49e0edf5ec64912: Status 404 returned error can't find the container with id d9407b658617cd0a1e0ba00c6a6c81f65a1f2d9cccc0ee71a49e0edf5ec64912 Apr 20 19:30:38.467607 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.467524 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm"] Apr 20 19:30:38.467769 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.467741 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" containerName="manager" containerID="cri-o://cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60" gracePeriod=2 Apr 20 19:30:38.469909 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.469871 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found 
between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.471543 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.471520 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm"] Apr 20 19:30:38.490911 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.490857 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"] Apr 20 19:30:38.496234 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.496205 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"] Apr 20 19:30:38.496858 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.496839 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" containerName="manager" Apr 20 19:30:38.496954 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.496861 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" containerName="manager" Apr 20 19:30:38.496954 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.496949 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" containerName="manager" Apr 20 19:30:38.500388 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.500369 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"] Apr 20 19:30:38.500489 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.500478 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.507862 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.507302 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"] Apr 20 19:30:38.517501 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.517477 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"] Apr 20 19:30:38.518054 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.518023 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" containerName="manager" Apr 20 19:30:38.518054 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.518046 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" containerName="manager" Apr 20 19:30:38.518201 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.518133 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" containerName="manager" Apr 20 19:30:38.521504 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.521484 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.527422 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.527392 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.529340 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.529307 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.536891 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.536866 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"] Apr 20 19:30:38.600059 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.600028 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbhw\" (UniqueName: \"kubernetes.io/projected/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-kube-api-access-5nbhw\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.600195 ip-10-0-134-118 kubenswrapper[2580]: 
I0420 19:30:38.600164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdzw\" (UniqueName: \"kubernetes.io/projected/69cad7e9-3ac6-4579-9920-4ef00503a30b-kube-api-access-vqdzw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wcdq4\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.600244 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.600209 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.600244 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.600233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/69cad7e9-3ac6-4579-9920-4ef00503a30b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wcdq4\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.694383 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.694358 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:38.696323 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.696295 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.701705 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.701680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdzw\" (UniqueName: \"kubernetes.io/projected/69cad7e9-3ac6-4579-9920-4ef00503a30b-kube-api-access-vqdzw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wcdq4\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.701819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.701721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.701819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.701740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/69cad7e9-3ac6-4579-9920-4ef00503a30b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wcdq4\" (UID: 
\"69cad7e9-3ac6-4579-9920-4ef00503a30b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.701819 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.701772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbhw\" (UniqueName: \"kubernetes.io/projected/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-kube-api-access-5nbhw\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.702117 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.702097 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.702187 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.702097 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/69cad7e9-3ac6-4579-9920-4ef00503a30b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wcdq4\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.711997 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.711976 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbhw\" (UniqueName: \"kubernetes.io/projected/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-kube-api-access-5nbhw\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" Apr 20 19:30:38.715920 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.715899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdzw\" (UniqueName: \"kubernetes.io/projected/69cad7e9-3ac6-4579-9920-4ef00503a30b-kube-api-access-vqdzw\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wcdq4\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.734318 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:30:38.734240 2580 kuberuntime_manager.go:623] "Missing actuated resource record" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" container="manager" Apr 20 19:30:38.735412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.735390 2580 generic.go:358] "Generic (PLEG): container finished" podID="9e2afac6-60bc-4770-8da4-f6a9520955f9" containerID="cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60" exitCode=0 Apr 20 19:30:38.735525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.735442 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" Apr 20 19:30:38.735525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.735463 2580 scope.go:117] "RemoveContainer" containerID="cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60" Apr 20 19:30:38.739118 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.739092 2580 status_manager.go:895] "Failed to get status for pod" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.741146 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.741124 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.742796 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.742771 2580 status_manager.go:895] "Failed to get status for pod" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 
19:30:38.744534 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.744511 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object" Apr 20 19:30:38.745852 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.745838 2580 scope.go:117] "RemoveContainer" containerID="cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60" Apr 20 19:30:38.746140 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:30:38.746124 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60\": container with ID starting with cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60 not found: ID does not exist" containerID="cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60" Apr 20 19:30:38.746203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.746149 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60"} err="failed to get container status \"cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60\": rpc error: code = NotFound desc = could not find container \"cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60\": container with ID starting with cb787ab085c2512a2007f0db66c2a53411f8639acee1923f3bd109e43e403c60 not found: ID does not exist" Apr 20 19:30:38.802235 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.802208 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e2afac6-60bc-4770-8da4-f6a9520955f9-extensions-socket-volume\") pod \"9e2afac6-60bc-4770-8da4-f6a9520955f9\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " Apr 20 19:30:38.802575 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.802340 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkzc8\" (UniqueName: \"kubernetes.io/projected/9e2afac6-60bc-4770-8da4-f6a9520955f9-kube-api-access-nkzc8\") pod \"9e2afac6-60bc-4770-8da4-f6a9520955f9\" (UID: \"9e2afac6-60bc-4770-8da4-f6a9520955f9\") " Apr 20 19:30:38.802630 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.802609 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e2afac6-60bc-4770-8da4-f6a9520955f9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9e2afac6-60bc-4770-8da4-f6a9520955f9" (UID: "9e2afac6-60bc-4770-8da4-f6a9520955f9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:30:38.804523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.804500 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e2afac6-60bc-4770-8da4-f6a9520955f9-kube-api-access-nkzc8" (OuterVolumeSpecName: "kube-api-access-nkzc8") pod "9e2afac6-60bc-4770-8da4-f6a9520955f9" (UID: "9e2afac6-60bc-4770-8da4-f6a9520955f9"). InnerVolumeSpecName "kube-api-access-nkzc8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:30:38.847886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.847861 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" Apr 20 19:30:38.855627 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.855608 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"
Apr 20 19:30:38.904656 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.904587 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9e2afac6-60bc-4770-8da4-f6a9520955f9-extensions-socket-volume\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:38.904656 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:38.904622 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkzc8\" (UniqueName: \"kubernetes.io/projected/9e2afac6-60bc-4770-8da4-f6a9520955f9-kube-api-access-nkzc8\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:39.037950 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.037918 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"]
Apr 20 19:30:39.041054 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:30:39.041025 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7281bc21_19ab_4e21_a6e7_3ceb4171e7c3.slice/crio-e7520252098c06ed070d80083abd2ef32ac5e7acee84b425054b48f71a9034f4 WatchSource:0}: Error finding container e7520252098c06ed070d80083abd2ef32ac5e7acee84b425054b48f71a9034f4: Status 404 returned error can't find the container with id e7520252098c06ed070d80083abd2ef32ac5e7acee84b425054b48f71a9034f4
Apr 20 19:30:39.042298 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.042275 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"]
Apr 20 19:30:39.042984 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:30:39.042956 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69cad7e9_3ac6_4579_9920_4ef00503a30b.slice/crio-daa16efdbdc9f458ff673fd4ccaa2fed654f8323c00d5d00ec5b31245084b39d WatchSource:0}: Error finding container daa16efdbdc9f458ff673fd4ccaa2fed654f8323c00d5d00ec5b31245084b39d: Status 404 returned error can't find the container with id daa16efdbdc9f458ff673fd4ccaa2fed654f8323c00d5d00ec5b31245084b39d
Apr 20 19:30:39.060674 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.060646 2580 status_manager.go:895] "Failed to get status for pod" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object"
Apr 20 19:30:39.062182 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.062153 2580 status_manager.go:895] "Failed to get status for pod" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vm4sm" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vm4sm\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object"
Apr 20 19:30:39.553193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.553155 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e2afac6-60bc-4770-8da4-f6a9520955f9" path="/var/lib/kubelet/pods/9e2afac6-60bc-4770-8da4-f6a9520955f9/volumes"
Apr 20 19:30:39.741741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.741703 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" event={"ID":"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3","Type":"ContainerStarted","Data":"dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330"}
Apr 20 19:30:39.741741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.741744 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" event={"ID":"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3","Type":"ContainerStarted","Data":"e7520252098c06ed070d80083abd2ef32ac5e7acee84b425054b48f71a9034f4"}
Apr 20 19:30:39.741988 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.741766 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"
Apr 20 19:30:39.744342 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.744305 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" containerName="manager" containerID="cri-o://d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0" gracePeriod=2
Apr 20 19:30:39.745820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.745782 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" event={"ID":"69cad7e9-3ac6-4579-9920-4ef00503a30b","Type":"ContainerStarted","Data":"5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be"}
Apr 20 19:30:39.745820 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.745818 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" event={"ID":"69cad7e9-3ac6-4579-9920-4ef00503a30b","Type":"ContainerStarted","Data":"daa16efdbdc9f458ff673fd4ccaa2fed654f8323c00d5d00ec5b31245084b39d"}
Apr 20 19:30:39.745996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.745908 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"
Apr 20 19:30:39.775651 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.775603 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" podStartSLOduration=1.775586952 podStartE2EDuration="1.775586952s" podCreationTimestamp="2026-04-20 19:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:30:39.771846701 +0000 UTC m=+606.927396519" watchObservedRunningTime="2026-04-20 19:30:39.775586952 +0000 UTC m=+606.931136770"
Apr 20 19:30:39.807106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.807011 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" podStartSLOduration=1.806996984 podStartE2EDuration="1.806996984s" podCreationTimestamp="2026-04-20 19:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:30:39.803486017 +0000 UTC m=+606.959035849" watchObservedRunningTime="2026-04-20 19:30:39.806996984 +0000 UTC m=+606.962546803"
Apr 20 19:30:39.991795 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.991760 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"
Apr 20 19:30:39.993892 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:39.993848 2580 status_manager.go:895] "Failed to get status for pod" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object"
Apr 20 19:30:40.014949 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.014916 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/407506dc-510d-4722-8c05-01fb02fbba8c-extensions-socket-volume\") pod \"407506dc-510d-4722-8c05-01fb02fbba8c\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") "
Apr 20 19:30:40.015091 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.015013 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd9v6\" (UniqueName: \"kubernetes.io/projected/407506dc-510d-4722-8c05-01fb02fbba8c-kube-api-access-cd9v6\") pod \"407506dc-510d-4722-8c05-01fb02fbba8c\" (UID: \"407506dc-510d-4722-8c05-01fb02fbba8c\") "
Apr 20 19:30:40.015325 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.015303 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/407506dc-510d-4722-8c05-01fb02fbba8c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "407506dc-510d-4722-8c05-01fb02fbba8c" (UID: "407506dc-510d-4722-8c05-01fb02fbba8c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:30:40.017169 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.017150 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407506dc-510d-4722-8c05-01fb02fbba8c-kube-api-access-cd9v6" (OuterVolumeSpecName: "kube-api-access-cd9v6") pod "407506dc-510d-4722-8c05-01fb02fbba8c" (UID: "407506dc-510d-4722-8c05-01fb02fbba8c"). InnerVolumeSpecName "kube-api-access-cd9v6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:30:40.116240 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.116149 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cd9v6\" (UniqueName: \"kubernetes.io/projected/407506dc-510d-4722-8c05-01fb02fbba8c-kube-api-access-cd9v6\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:40.116240 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.116179 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/407506dc-510d-4722-8c05-01fb02fbba8c-extensions-socket-volume\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:40.751644 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.751605 2580 generic.go:358] "Generic (PLEG): container finished" podID="407506dc-510d-4722-8c05-01fb02fbba8c" containerID="d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0" exitCode=0
Apr 20 19:30:40.751857 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.751668 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp"
Apr 20 19:30:40.751857 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.751706 2580 scope.go:117] "RemoveContainer" containerID="d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0"
Apr 20 19:30:40.754037 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.754010 2580 status_manager.go:895] "Failed to get status for pod" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object"
Apr 20 19:30:40.761842 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.761814 2580 status_manager.go:895] "Failed to get status for pod" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-wj9cp\" is forbidden: User \"system:node:ip-10-0-134-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-118.ec2.internal' and this object"
Apr 20 19:30:40.761911 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.761888 2580 scope.go:117] "RemoveContainer" containerID="d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0"
Apr 20 19:30:40.762168 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:30:40.762149 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0\": container with ID starting with d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0 not found: ID does not exist" containerID="d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0"
Apr 20 19:30:40.762280 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:40.762176 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0"} err="failed to get container status \"d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0\": rpc error: code = NotFound desc = could not find container \"d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0\": container with ID starting with d808324aed7244b822e3bc0230a88b46f0ad0aa74758849296d12ad1d42532c0 not found: ID does not exist"
Apr 20 19:30:41.554097 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:41.554058 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="407506dc-510d-4722-8c05-01fb02fbba8c" path="/var/lib/kubelet/pods/407506dc-510d-4722-8c05-01fb02fbba8c/volumes"
Apr 20 19:30:50.753754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:50.753681 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"
Apr 20 19:30:50.753754 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:50.753733 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"
Apr 20 19:30:50.831449 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:50.831413 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"]
Apr 20 19:30:50.831664 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:50.831643 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" podUID="69cad7e9-3ac6-4579-9920-4ef00503a30b" containerName="manager" containerID="cri-o://5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be" gracePeriod=10
Apr 20 19:30:51.074093 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.074069 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"
Apr 20 19:30:51.116894 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.116849 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdzw\" (UniqueName: \"kubernetes.io/projected/69cad7e9-3ac6-4579-9920-4ef00503a30b-kube-api-access-vqdzw\") pod \"69cad7e9-3ac6-4579-9920-4ef00503a30b\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") "
Apr 20 19:30:51.117074 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.116900 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/69cad7e9-3ac6-4579-9920-4ef00503a30b-extensions-socket-volume\") pod \"69cad7e9-3ac6-4579-9920-4ef00503a30b\" (UID: \"69cad7e9-3ac6-4579-9920-4ef00503a30b\") "
Apr 20 19:30:51.117338 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.117314 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cad7e9-3ac6-4579-9920-4ef00503a30b-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "69cad7e9-3ac6-4579-9920-4ef00503a30b" (UID: "69cad7e9-3ac6-4579-9920-4ef00503a30b"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:30:51.119527 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.119494 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69cad7e9-3ac6-4579-9920-4ef00503a30b-kube-api-access-vqdzw" (OuterVolumeSpecName: "kube-api-access-vqdzw") pod "69cad7e9-3ac6-4579-9920-4ef00503a30b" (UID: "69cad7e9-3ac6-4579-9920-4ef00503a30b"). InnerVolumeSpecName "kube-api-access-vqdzw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:30:51.217706 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.217671 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqdzw\" (UniqueName: \"kubernetes.io/projected/69cad7e9-3ac6-4579-9920-4ef00503a30b-kube-api-access-vqdzw\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:51.217706 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.217702 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/69cad7e9-3ac6-4579-9920-4ef00503a30b-extensions-socket-volume\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:30:51.258152 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.258115 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"]
Apr 20 19:30:51.258552 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.258539 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69cad7e9-3ac6-4579-9920-4ef00503a30b" containerName="manager"
Apr 20 19:30:51.258603 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.258555 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cad7e9-3ac6-4579-9920-4ef00503a30b" containerName="manager"
Apr 20 19:30:51.258651 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.258642 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="69cad7e9-3ac6-4579-9920-4ef00503a30b" containerName="manager"
Apr 20 19:30:51.261991 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.261973 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.273836 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.273804 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"]
Apr 20 19:30:51.318922 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.318824 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjv7v\" (UniqueName: \"kubernetes.io/projected/92a82465-6c14-4457-b8af-00b32e34335d-kube-api-access-rjv7v\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nmfdr\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.318922 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.318887 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92a82465-6c14-4457-b8af-00b32e34335d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nmfdr\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.419966 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.419926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92a82465-6c14-4457-b8af-00b32e34335d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nmfdr\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.420139 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.420025 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjv7v\" (UniqueName: \"kubernetes.io/projected/92a82465-6c14-4457-b8af-00b32e34335d-kube-api-access-rjv7v\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nmfdr\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.420393 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.420372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92a82465-6c14-4457-b8af-00b32e34335d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nmfdr\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.434473 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.434447 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjv7v\" (UniqueName: \"kubernetes.io/projected/92a82465-6c14-4457-b8af-00b32e34335d-kube-api-access-rjv7v\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nmfdr\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.573478 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.573397 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.710812 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.710786 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"]
Apr 20 19:30:51.712866 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:30:51.712835 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92a82465_6c14_4457_b8af_00b32e34335d.slice/crio-4c4d523bd5597316daae3008de92969635dd5548aea4e2c9ab78181131b412fd WatchSource:0}: Error finding container 4c4d523bd5597316daae3008de92969635dd5548aea4e2c9ab78181131b412fd: Status 404 returned error can't find the container with id 4c4d523bd5597316daae3008de92969635dd5548aea4e2c9ab78181131b412fd
Apr 20 19:30:51.800405 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.800370 2580 generic.go:358] "Generic (PLEG): container finished" podID="69cad7e9-3ac6-4579-9920-4ef00503a30b" containerID="5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be" exitCode=0
Apr 20 19:30:51.800837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.800445 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" event={"ID":"69cad7e9-3ac6-4579-9920-4ef00503a30b","Type":"ContainerDied","Data":"5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be"}
Apr 20 19:30:51.800837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.800480 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"
Apr 20 19:30:51.800837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.800492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4" event={"ID":"69cad7e9-3ac6-4579-9920-4ef00503a30b","Type":"ContainerDied","Data":"daa16efdbdc9f458ff673fd4ccaa2fed654f8323c00d5d00ec5b31245084b39d"}
Apr 20 19:30:51.800837 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.800514 2580 scope.go:117] "RemoveContainer" containerID="5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be"
Apr 20 19:30:51.802457 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.802427 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" event={"ID":"92a82465-6c14-4457-b8af-00b32e34335d","Type":"ContainerStarted","Data":"3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6"}
Apr 20 19:30:51.802584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.802468 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" event={"ID":"92a82465-6c14-4457-b8af-00b32e34335d","Type":"ContainerStarted","Data":"4c4d523bd5597316daae3008de92969635dd5548aea4e2c9ab78181131b412fd"}
Apr 20 19:30:51.802584 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.802550 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:30:51.810349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.810332 2580 scope.go:117] "RemoveContainer" containerID="5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be"
Apr 20 19:30:51.810653 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:30:51.810632 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be\": container with ID starting with 5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be not found: ID does not exist" containerID="5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be"
Apr 20 19:30:51.810715 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.810661 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be"} err="failed to get container status \"5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be\": rpc error: code = NotFound desc = could not find container \"5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be\": container with ID starting with 5c92f3ab1a6550a4034beaaf71ca5d69fb71ba4d5ea12c56f82f8318cd8df7be not found: ID does not exist"
Apr 20 19:30:51.824104 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.824015 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" podStartSLOduration=0.824002491 podStartE2EDuration="824.002491ms" podCreationTimestamp="2026-04-20 19:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:30:51.821667086 +0000 UTC m=+618.977216901" watchObservedRunningTime="2026-04-20 19:30:51.824002491 +0000 UTC m=+618.979552310"
Apr 20 19:30:51.846836 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.846802 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"]
Apr 20 19:30:51.851411 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:51.851375 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wcdq4"]
Apr 20 19:30:53.554409 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:30:53.554377 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cad7e9-3ac6-4579-9920-4ef00503a30b" path="/var/lib/kubelet/pods/69cad7e9-3ac6-4579-9920-4ef00503a30b/volumes"
Apr 20 19:31:02.810661 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:02.810626 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"
Apr 20 19:31:02.862193 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:02.862160 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"]
Apr 20 19:31:02.862472 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:02.862427 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" podUID="7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" containerName="manager" containerID="cri-o://dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330" gracePeriod=10
Apr 20 19:31:03.110705 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.110682 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"
Apr 20 19:31:03.232163 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.232125 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nbhw\" (UniqueName: \"kubernetes.io/projected/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-kube-api-access-5nbhw\") pod \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") "
Apr 20 19:31:03.232367 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.232193 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-extensions-socket-volume\") pod \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\" (UID: \"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3\") "
Apr 20 19:31:03.232571 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.232550 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" (UID: "7281bc21-19ab-4e21-a6e7-3ceb4171e7c3"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:31:03.234408 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.234385 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-kube-api-access-5nbhw" (OuterVolumeSpecName: "kube-api-access-5nbhw") pod "7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" (UID: "7281bc21-19ab-4e21-a6e7-3ceb4171e7c3"). InnerVolumeSpecName "kube-api-access-5nbhw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:31:03.333886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.333855 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nbhw\" (UniqueName: \"kubernetes.io/projected/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-kube-api-access-5nbhw\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:31:03.333886 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.333884 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3-extensions-socket-volume\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:31:03.862925 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.862836 2580 generic.go:358] "Generic (PLEG): container finished" podID="7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" containerID="dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330" exitCode=0
Apr 20 19:31:03.863369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.862933 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"
Apr 20 19:31:03.863369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.862922 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" event={"ID":"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3","Type":"ContainerDied","Data":"dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330"}
Apr 20 19:31:03.863369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.863044 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf" event={"ID":"7281bc21-19ab-4e21-a6e7-3ceb4171e7c3","Type":"ContainerDied","Data":"e7520252098c06ed070d80083abd2ef32ac5e7acee84b425054b48f71a9034f4"}
Apr 20 19:31:03.863369 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.863064 2580 scope.go:117] "RemoveContainer" containerID="dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330"
Apr 20 19:31:03.872626 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.872421 2580 scope.go:117] "RemoveContainer" containerID="dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330"
Apr 20 19:31:03.872733 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:31:03.872711 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330\": container with ID starting with dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330 not found: ID does not exist" containerID="dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330"
Apr 20 19:31:03.872793 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.872750 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330"} err="failed to get container status \"dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330\": rpc error: code = NotFound desc = could not find container \"dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330\": container with ID starting with dcebcedac7b3b5c9f8d6cc61657c79b12e5d9fa2cdcec4f101449d17d31b1330 not found: ID does not exist"
Apr 20 19:31:03.894588 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.894544 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"]
Apr 20 19:31:03.896896 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:03.896868 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kkdqf"]
Apr 20 19:31:05.554177 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:05.554141 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" path="/var/lib/kubelet/pods/7281bc21-19ab-4e21-a6e7-3ceb4171e7c3/volumes"
Apr 20 19:31:07.135838 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.135804 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"]
Apr 20 19:31:07.136522 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.136495 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" containerName="manager"
Apr 20 19:31:07.136522 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.136523 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" containerName="manager"
Apr 20 19:31:07.136700 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.136655 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7281bc21-19ab-4e21-a6e7-3ceb4171e7c3" containerName="manager"
Apr 20 19:31:07.141743 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.141711 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.143928 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.143905 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-mxpkt\""
Apr 20 19:31:07.150049 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.149981 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"]
Apr 20 19:31:07.267329 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267294 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267427 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267461 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/48fb4579-df4a-4fbf-9df3-2833945580ae-kube-api-access-4lr6k\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267525 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267505 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267745 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267528 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267745 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267613 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/48fb4579-df4a-4fbf-9df3-2833945580ae-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.267745 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.267637 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.368355 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368319 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.368520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/48fb4579-df4a-4fbf-9df3-2833945580ae-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"
Apr 20 19:31:07.368520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368434 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368520 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368521 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368813 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368542 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368813 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368565 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/48fb4579-df4a-4fbf-9df3-2833945580ae-kube-api-access-4lr6k\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368813 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.368996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.368978 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.369080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.369026 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.369080 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.369069 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.369237 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.369105 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.369764 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.369741 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/48fb4579-df4a-4fbf-9df3-2833945580ae-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.370999 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.370980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: 
\"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.371288 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.371273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.377649 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.377617 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/48fb4579-df4a-4fbf-9df3-2833945580ae-kube-api-access-4lr6k\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.377789 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.377753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/48fb4579-df4a-4fbf-9df3-2833945580ae-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cqwtl\" (UID: \"48fb4579-df4a-4fbf-9df3-2833945580ae\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.455524 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.455435 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:07.593376 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.593345 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl"] Apr 20 19:31:07.595224 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:31:07.595199 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fb4579_df4a_4fbf_9df3_2833945580ae.slice/crio-f34a7cabaf3a7a21fecfcac49304463de83fa51d2cea55d3dc0a172b1f8dd925 WatchSource:0}: Error finding container f34a7cabaf3a7a21fecfcac49304463de83fa51d2cea55d3dc0a172b1f8dd925: Status 404 returned error can't find the container with id f34a7cabaf3a7a21fecfcac49304463de83fa51d2cea55d3dc0a172b1f8dd925 Apr 20 19:31:07.600967 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.600926 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 19:31:07.601052 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.601019 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 19:31:07.601098 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.601057 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 19:31:07.882779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.882736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" 
event={"ID":"48fb4579-df4a-4fbf-9df3-2833945580ae","Type":"ContainerStarted","Data":"8ecc8e7154389c83f84a787f899bbc8e9311ad2e66afdb9e642bff021875c7ba"} Apr 20 19:31:07.882779 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.882783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" event={"ID":"48fb4579-df4a-4fbf-9df3-2833945580ae","Type":"ContainerStarted","Data":"f34a7cabaf3a7a21fecfcac49304463de83fa51d2cea55d3dc0a172b1f8dd925"} Apr 20 19:31:07.908264 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:07.908199 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" podStartSLOduration=0.908183236 podStartE2EDuration="908.183236ms" podCreationTimestamp="2026-04-20 19:31:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:31:07.904407224 +0000 UTC m=+635.059957055" watchObservedRunningTime="2026-04-20 19:31:07.908183236 +0000 UTC m=+635.063733054" Apr 20 19:31:08.456368 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:08.456324 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:08.461351 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:08.461327 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:08.887476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:08.887445 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:08.888476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:08.888451 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cqwtl" Apr 20 19:31:20.206783 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.206740 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:31:20.210900 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.210876 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.213592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.213132 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-j7x8z\"" Apr 20 19:31:20.213592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.213573 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 19:31:20.216480 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.216455 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:31:20.239377 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.239347 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:31:20.287398 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.287367 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3c46ebc3-6578-4228-b8d4-8cb3c18b5038-config-file\") pod \"limitador-limitador-78c99df468-ql8jb\" (UID: \"3c46ebc3-6578-4228-b8d4-8cb3c18b5038\") " pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.287578 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.287434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjxf4\" (UniqueName: 
\"kubernetes.io/projected/3c46ebc3-6578-4228-b8d4-8cb3c18b5038-kube-api-access-hjxf4\") pod \"limitador-limitador-78c99df468-ql8jb\" (UID: \"3c46ebc3-6578-4228-b8d4-8cb3c18b5038\") " pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.388616 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.388577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjxf4\" (UniqueName: \"kubernetes.io/projected/3c46ebc3-6578-4228-b8d4-8cb3c18b5038-kube-api-access-hjxf4\") pod \"limitador-limitador-78c99df468-ql8jb\" (UID: \"3c46ebc3-6578-4228-b8d4-8cb3c18b5038\") " pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.388814 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.388712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3c46ebc3-6578-4228-b8d4-8cb3c18b5038-config-file\") pod \"limitador-limitador-78c99df468-ql8jb\" (UID: \"3c46ebc3-6578-4228-b8d4-8cb3c18b5038\") " pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.389360 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.389337 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3c46ebc3-6578-4228-b8d4-8cb3c18b5038-config-file\") pod \"limitador-limitador-78c99df468-ql8jb\" (UID: \"3c46ebc3-6578-4228-b8d4-8cb3c18b5038\") " pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.397106 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.397081 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjxf4\" (UniqueName: \"kubernetes.io/projected/3c46ebc3-6578-4228-b8d4-8cb3c18b5038-kube-api-access-hjxf4\") pod \"limitador-limitador-78c99df468-ql8jb\" (UID: \"3c46ebc3-6578-4228-b8d4-8cb3c18b5038\") " pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.523332 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.523306 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:20.654442 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.654402 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:31:20.655666 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:31:20.655640 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c46ebc3_6578_4228_b8d4_8cb3c18b5038.slice/crio-050d49c14d1c633dd0b4b03f40c4d0b093f71d1bc32eb48c3456ab0275ca2322 WatchSource:0}: Error finding container 050d49c14d1c633dd0b4b03f40c4d0b093f71d1bc32eb48c3456ab0275ca2322: Status 404 returned error can't find the container with id 050d49c14d1c633dd0b4b03f40c4d0b093f71d1bc32eb48c3456ab0275ca2322 Apr 20 19:31:20.657537 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.657519 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:31:20.831752 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.831675 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-8ts6z"] Apr 20 19:31:20.836587 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.836569 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-8ts6z" Apr 20 19:31:20.839036 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.839013 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-psz6s\"" Apr 20 19:31:20.841202 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.841174 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-8ts6z"] Apr 20 19:31:20.892889 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.892859 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-765l2\" (UniqueName: \"kubernetes.io/projected/bc631e67-af47-49c9-930e-f2198541c837-kube-api-access-765l2\") pod \"authorino-7498df8756-8ts6z\" (UID: \"bc631e67-af47-49c9-930e-f2198541c837\") " pod="kuadrant-system/authorino-7498df8756-8ts6z" Apr 20 19:31:20.938408 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.938375 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" event={"ID":"3c46ebc3-6578-4228-b8d4-8cb3c18b5038","Type":"ContainerStarted","Data":"050d49c14d1c633dd0b4b03f40c4d0b093f71d1bc32eb48c3456ab0275ca2322"} Apr 20 19:31:20.993527 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:20.993490 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-765l2\" (UniqueName: \"kubernetes.io/projected/bc631e67-af47-49c9-930e-f2198541c837-kube-api-access-765l2\") pod \"authorino-7498df8756-8ts6z\" (UID: \"bc631e67-af47-49c9-930e-f2198541c837\") " pod="kuadrant-system/authorino-7498df8756-8ts6z" Apr 20 19:31:21.002108 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:21.002072 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-765l2\" (UniqueName: \"kubernetes.io/projected/bc631e67-af47-49c9-930e-f2198541c837-kube-api-access-765l2\") pod 
\"authorino-7498df8756-8ts6z\" (UID: \"bc631e67-af47-49c9-930e-f2198541c837\") " pod="kuadrant-system/authorino-7498df8756-8ts6z" Apr 20 19:31:21.147720 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:21.147633 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-8ts6z" Apr 20 19:31:21.278170 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:21.278136 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-8ts6z"] Apr 20 19:31:21.279642 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:31:21.279606 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc631e67_af47_49c9_930e_f2198541c837.slice/crio-045f61c865ee9ef391496aee1eeade247756cec8fd04cd4867009d8902b3fde9 WatchSource:0}: Error finding container 045f61c865ee9ef391496aee1eeade247756cec8fd04cd4867009d8902b3fde9: Status 404 returned error can't find the container with id 045f61c865ee9ef391496aee1eeade247756cec8fd04cd4867009d8902b3fde9 Apr 20 19:31:21.946657 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:21.946569 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-8ts6z" event={"ID":"bc631e67-af47-49c9-930e-f2198541c837","Type":"ContainerStarted","Data":"045f61c865ee9ef391496aee1eeade247756cec8fd04cd4867009d8902b3fde9"} Apr 20 19:31:25.967825 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:25.967726 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-8ts6z" event={"ID":"bc631e67-af47-49c9-930e-f2198541c837","Type":"ContainerStarted","Data":"17a9ae20d6255d411d7b66dc851630cca4c314a47b3ebef853dcc8c3d85f9737"} Apr 20 19:31:25.969236 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:25.969207 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" 
event={"ID":"3c46ebc3-6578-4228-b8d4-8cb3c18b5038","Type":"ContainerStarted","Data":"816bf390d237ebe75d6b315cc16bb4b7b07083f90b6b5f00f7d674a22f9b821d"} Apr 20 19:31:25.969399 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:25.969360 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:25.983337 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:25.983290 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-8ts6z" podStartSLOduration=1.744627812 podStartE2EDuration="5.983276287s" podCreationTimestamp="2026-04-20 19:31:20 +0000 UTC" firstStartedPulling="2026-04-20 19:31:21.281062617 +0000 UTC m=+648.436612417" lastFinishedPulling="2026-04-20 19:31:25.519711095 +0000 UTC m=+652.675260892" observedRunningTime="2026-04-20 19:31:25.982173069 +0000 UTC m=+653.137722890" watchObservedRunningTime="2026-04-20 19:31:25.983276287 +0000 UTC m=+653.138826106" Apr 20 19:31:25.998412 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:25.998360 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" podStartSLOduration=1.136241928 podStartE2EDuration="5.998344518s" podCreationTimestamp="2026-04-20 19:31:20 +0000 UTC" firstStartedPulling="2026-04-20 19:31:20.657659002 +0000 UTC m=+647.813208803" lastFinishedPulling="2026-04-20 19:31:25.519761593 +0000 UTC m=+652.675311393" observedRunningTime="2026-04-20 19:31:25.996986023 +0000 UTC m=+653.152535844" watchObservedRunningTime="2026-04-20 19:31:25.998344518 +0000 UTC m=+653.153894337" Apr 20 19:31:36.975505 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:36.975476 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-ql8jb" Apr 20 19:31:55.487686 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.487649 2580 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-xh6rp"] Apr 20 19:31:55.491221 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.491201 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" Apr 20 19:31:55.499161 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.499129 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-xh6rp"] Apr 20 19:31:55.610006 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.609965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqf85\" (UniqueName: \"kubernetes.io/projected/10f524fa-0e46-451b-83c1-619119d74c06-kube-api-access-pqf85\") pod \"authorino-8b475cf9f-xh6rp\" (UID: \"10f524fa-0e46-451b-83c1-619119d74c06\") " pod="kuadrant-system/authorino-8b475cf9f-xh6rp" Apr 20 19:31:55.711506 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.711469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqf85\" (UniqueName: \"kubernetes.io/projected/10f524fa-0e46-451b-83c1-619119d74c06-kube-api-access-pqf85\") pod \"authorino-8b475cf9f-xh6rp\" (UID: \"10f524fa-0e46-451b-83c1-619119d74c06\") " pod="kuadrant-system/authorino-8b475cf9f-xh6rp" Apr 20 19:31:55.721967 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.721934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqf85\" (UniqueName: \"kubernetes.io/projected/10f524fa-0e46-451b-83c1-619119d74c06-kube-api-access-pqf85\") pod \"authorino-8b475cf9f-xh6rp\" (UID: \"10f524fa-0e46-451b-83c1-619119d74c06\") " pod="kuadrant-system/authorino-8b475cf9f-xh6rp" Apr 20 19:31:55.751532 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.751466 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-xh6rp"] Apr 20 19:31:55.751805 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:31:55.751791 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-xh6rp"
Apr 20 19:31:55.886932 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:55.886905 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-xh6rp"]
Apr 20 19:31:55.889433 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:31:55.889403 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10f524fa_0e46_451b_83c1_619119d74c06.slice/crio-4fc71dddecf27308bbc3ffd22085e56a830efe273861f34ca30adf7db085e1e2 WatchSource:0}: Error finding container 4fc71dddecf27308bbc3ffd22085e56a830efe273861f34ca30adf7db085e1e2: Status 404 returned error can't find the container with id 4fc71dddecf27308bbc3ffd22085e56a830efe273861f34ca30adf7db085e1e2
Apr 20 19:31:56.092389 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:56.092352 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" event={"ID":"10f524fa-0e46-451b-83c1-619119d74c06","Type":"ContainerStarted","Data":"4fc71dddecf27308bbc3ffd22085e56a830efe273861f34ca30adf7db085e1e2"}
Apr 20 19:31:57.097628 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.097582 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" event={"ID":"10f524fa-0e46-451b-83c1-619119d74c06","Type":"ContainerStarted","Data":"5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd"}
Apr 20 19:31:57.098078 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.097635 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" podUID="10f524fa-0e46-451b-83c1-619119d74c06" containerName="authorino" containerID="cri-o://5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd" gracePeriod=30
Apr 20 19:31:57.112272 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.112199 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" podStartSLOduration=1.77579715 podStartE2EDuration="2.112179822s" podCreationTimestamp="2026-04-20 19:31:55 +0000 UTC" firstStartedPulling="2026-04-20 19:31:55.891897592 +0000 UTC m=+683.047447390" lastFinishedPulling="2026-04-20 19:31:56.228280253 +0000 UTC m=+683.383830062" observedRunningTime="2026-04-20 19:31:57.111242875 +0000 UTC m=+684.266792705" watchObservedRunningTime="2026-04-20 19:31:57.112179822 +0000 UTC m=+684.267729643"
Apr 20 19:31:57.368644 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.368616 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-xh6rp"
Apr 20 19:31:57.530574 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.530537 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqf85\" (UniqueName: \"kubernetes.io/projected/10f524fa-0e46-451b-83c1-619119d74c06-kube-api-access-pqf85\") pod \"10f524fa-0e46-451b-83c1-619119d74c06\" (UID: \"10f524fa-0e46-451b-83c1-619119d74c06\") "
Apr 20 19:31:57.532738 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.532711 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10f524fa-0e46-451b-83c1-619119d74c06-kube-api-access-pqf85" (OuterVolumeSpecName: "kube-api-access-pqf85") pod "10f524fa-0e46-451b-83c1-619119d74c06" (UID: "10f524fa-0e46-451b-83c1-619119d74c06"). InnerVolumeSpecName "kube-api-access-pqf85". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:31:57.631722 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.631636 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pqf85\" (UniqueName: \"kubernetes.io/projected/10f524fa-0e46-451b-83c1-619119d74c06-kube-api-access-pqf85\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:31:57.977477 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.977383 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-8ts6z"]
Apr 20 19:31:57.977691 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:57.977662 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-8ts6z" podUID="bc631e67-af47-49c9-930e-f2198541c837" containerName="authorino" containerID="cri-o://17a9ae20d6255d411d7b66dc851630cca4c314a47b3ebef853dcc8c3d85f9737" gracePeriod=30
Apr 20 19:31:58.103802 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.103765 2580 generic.go:358] "Generic (PLEG): container finished" podID="bc631e67-af47-49c9-930e-f2198541c837" containerID="17a9ae20d6255d411d7b66dc851630cca4c314a47b3ebef853dcc8c3d85f9737" exitCode=0
Apr 20 19:31:58.104227 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.103823 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-8ts6z" event={"ID":"bc631e67-af47-49c9-930e-f2198541c837","Type":"ContainerDied","Data":"17a9ae20d6255d411d7b66dc851630cca4c314a47b3ebef853dcc8c3d85f9737"}
Apr 20 19:31:58.105189 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.105163 2580 generic.go:358] "Generic (PLEG): container finished" podID="10f524fa-0e46-451b-83c1-619119d74c06" containerID="5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd" exitCode=0
Apr 20 19:31:58.105305 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.105206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" event={"ID":"10f524fa-0e46-451b-83c1-619119d74c06","Type":"ContainerDied","Data":"5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd"}
Apr 20 19:31:58.105305 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.105238 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-xh6rp" event={"ID":"10f524fa-0e46-451b-83c1-619119d74c06","Type":"ContainerDied","Data":"4fc71dddecf27308bbc3ffd22085e56a830efe273861f34ca30adf7db085e1e2"}
Apr 20 19:31:58.105305 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.105292 2580 scope.go:117] "RemoveContainer" containerID="5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd"
Apr 20 19:31:58.105434 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.105293 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-xh6rp"
Apr 20 19:31:58.115314 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.115296 2580 scope.go:117] "RemoveContainer" containerID="5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd"
Apr 20 19:31:58.115590 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:31:58.115566 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd\": container with ID starting with 5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd not found: ID does not exist" containerID="5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd"
Apr 20 19:31:58.115670 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.115602 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd"} err="failed to get container status \"5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd\": rpc error: code = NotFound desc = could not find container \"5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd\": container with ID starting with 5dd29aef81aa76bbe4d7e554d7b99061975e9a2f68b3f19bb65be9ee55ba50cd not found: ID does not exist"
Apr 20 19:31:58.122010 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.121982 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-xh6rp"]
Apr 20 19:31:58.124855 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.124830 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-xh6rp"]
Apr 20 19:31:58.237952 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.237880 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-8ts6z"
Apr 20 19:31:58.337590 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.337551 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-765l2\" (UniqueName: \"kubernetes.io/projected/bc631e67-af47-49c9-930e-f2198541c837-kube-api-access-765l2\") pod \"bc631e67-af47-49c9-930e-f2198541c837\" (UID: \"bc631e67-af47-49c9-930e-f2198541c837\") "
Apr 20 19:31:58.339764 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.339743 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc631e67-af47-49c9-930e-f2198541c837-kube-api-access-765l2" (OuterVolumeSpecName: "kube-api-access-765l2") pod "bc631e67-af47-49c9-930e-f2198541c837" (UID: "bc631e67-af47-49c9-930e-f2198541c837"). InnerVolumeSpecName "kube-api-access-765l2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:31:58.438533 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:58.438495 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-765l2\" (UniqueName: \"kubernetes.io/projected/bc631e67-af47-49c9-930e-f2198541c837-kube-api-access-765l2\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\""
Apr 20 19:31:59.110638 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.110552 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-8ts6z"
Apr 20 19:31:59.111107 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.110552 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-8ts6z" event={"ID":"bc631e67-af47-49c9-930e-f2198541c837","Type":"ContainerDied","Data":"045f61c865ee9ef391496aee1eeade247756cec8fd04cd4867009d8902b3fde9"}
Apr 20 19:31:59.111107 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.110688 2580 scope.go:117] "RemoveContainer" containerID="17a9ae20d6255d411d7b66dc851630cca4c314a47b3ebef853dcc8c3d85f9737"
Apr 20 19:31:59.132713 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.132679 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-8ts6z"]
Apr 20 19:31:59.136067 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.136042 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-8ts6z"]
Apr 20 19:31:59.553833 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.553796 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10f524fa-0e46-451b-83c1-619119d74c06" path="/var/lib/kubelet/pods/10f524fa-0e46-451b-83c1-619119d74c06/volumes"
Apr 20 19:31:59.554129 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:31:59.554116 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc631e67-af47-49c9-930e-f2198541c837" path="/var/lib/kubelet/pods/bc631e67-af47-49c9-930e-f2198541c837/volumes"
Apr 20 19:32:05.483547 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:32:05.483499 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:32:49.655002 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:32:49.654967 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:32:50.858205 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:32:50.858167 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:33:00.265898 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.265859 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"]
Apr 20 19:33:00.266523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.266314 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10f524fa-0e46-451b-83c1-619119d74c06" containerName="authorino"
Apr 20 19:33:00.266523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.266328 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="10f524fa-0e46-451b-83c1-619119d74c06" containerName="authorino"
Apr 20 19:33:00.266523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.266352 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc631e67-af47-49c9-930e-f2198541c837" containerName="authorino"
Apr 20 19:33:00.266523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.266361 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc631e67-af47-49c9-930e-f2198541c837" containerName="authorino"
Apr 20 19:33:00.266523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.266432 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="10f524fa-0e46-451b-83c1-619119d74c06" containerName="authorino"
Apr 20 19:33:00.266523 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.266439 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc631e67-af47-49c9-930e-f2198541c837" containerName="authorino"
Apr 20 19:33:00.269338 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.269317 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.271803 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.271775 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 20 19:33:00.272592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.272524 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 20 19:33:00.272592 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.272537 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-9dkrq\""
Apr 20 19:33:00.272791 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.272526 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 20 19:33:00.281433 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.281405 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"]
Apr 20 19:33:00.394195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.394164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.394391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.394215 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.394391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.394242 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.394391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.394287 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5gth\" (UniqueName: \"kubernetes.io/projected/aade7fe2-9b25-42bb-8645-888ef558ba10-kube-api-access-f5gth\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.394391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.394319 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.394391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.394380 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aade7fe2-9b25-42bb-8645-888ef558ba10-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495242 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495200 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495326 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5gth\" (UniqueName: \"kubernetes.io/projected/aade7fe2-9b25-42bb-8645-888ef558ba10-kube-api-access-f5gth\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495389 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aade7fe2-9b25-42bb-8645-888ef558ba10-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495742 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495721 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.495850 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.495826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.496360 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.496278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.497767 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.497742 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aade7fe2-9b25-42bb-8645-888ef558ba10-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.498139 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.498119 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aade7fe2-9b25-42bb-8645-888ef558ba10-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.502700 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.502678 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5gth\" (UniqueName: \"kubernetes.io/projected/aade7fe2-9b25-42bb-8645-888ef558ba10-kube-api-access-f5gth\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8\" (UID: \"aade7fe2-9b25-42bb-8645-888ef558ba10\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.580974 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.580868 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:00.718996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:00.718969 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"]
Apr 20 19:33:00.720416 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:33:00.720387 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaade7fe2_9b25_42bb_8645_888ef558ba10.slice/crio-f3164a1cadd89893c94321033a447920df1fa12468302748bffdbe1208a89d26 WatchSource:0}: Error finding container f3164a1cadd89893c94321033a447920df1fa12468302748bffdbe1208a89d26: Status 404 returned error can't find the container with id f3164a1cadd89893c94321033a447920df1fa12468302748bffdbe1208a89d26
Apr 20 19:33:01.240817 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:01.240782 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:33:01.381450 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:01.381416 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8" event={"ID":"aade7fe2-9b25-42bb-8645-888ef558ba10","Type":"ContainerStarted","Data":"f3164a1cadd89893c94321033a447920df1fa12468302748bffdbe1208a89d26"}
Apr 20 19:33:05.539391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:05.539341 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:33:07.412510 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:07.412470 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8" event={"ID":"aade7fe2-9b25-42bb-8645-888ef558ba10","Type":"ContainerStarted","Data":"f496d328a5ad7881678aa02df7f22be66f8ae0cd85c1929e64f7de0d387a3458"}
Apr 20 19:33:13.446739 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:13.446689 2580 generic.go:358] "Generic (PLEG): container finished" podID="aade7fe2-9b25-42bb-8645-888ef558ba10" containerID="f496d328a5ad7881678aa02df7f22be66f8ae0cd85c1929e64f7de0d387a3458" exitCode=0
Apr 20 19:33:13.447123 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:13.446763 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8" event={"ID":"aade7fe2-9b25-42bb-8645-888ef558ba10","Type":"ContainerDied","Data":"f496d328a5ad7881678aa02df7f22be66f8ae0cd85c1929e64f7de0d387a3458"}
Apr 20 19:33:15.456074 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:15.456030 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8" event={"ID":"aade7fe2-9b25-42bb-8645-888ef558ba10","Type":"ContainerStarted","Data":"220f04924be1dfa58250009c14afdf0de2420cdf93e22b5b083506528bf9e97d"}
Apr 20 19:33:15.456539 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:15.456271 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:33:15.475077 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:15.475037 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8" podStartSLOduration=1.667338462 podStartE2EDuration="15.475023142s" podCreationTimestamp="2026-04-20 19:33:00 +0000 UTC" firstStartedPulling="2026-04-20 19:33:00.722575581 +0000 UTC m=+747.878125381" lastFinishedPulling="2026-04-20 19:33:14.53026025 +0000 UTC m=+761.685810061" observedRunningTime="2026-04-20 19:33:15.473418537 +0000 UTC m=+762.628968378" watchObservedRunningTime="2026-04-20 19:33:15.475023142 +0000 UTC m=+762.630572961"
Apr 20 19:33:21.245849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:21.245811 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:33:26.473512 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:33:26.473478 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8"
Apr 20 19:34:11.146511 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:11.146478 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:34:37.627243 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.627202 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-9b67d995-595th"]
Apr 20 19:34:37.629774 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.629757 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.632613 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.632593 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-psz6s\""
Apr 20 19:34:37.632736 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.632616 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 19:34:37.637557 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.637206 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-9b67d995-595th"]
Apr 20 19:34:37.729553 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.729513 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9djf\" (UniqueName: \"kubernetes.io/projected/1e48fc3f-d2e7-4523-acfb-70928565fd03-kube-api-access-l9djf\") pod \"authorino-9b67d995-595th\" (UID: \"1e48fc3f-d2e7-4523-acfb-70928565fd03\") " pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.729751 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.729666 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1e48fc3f-d2e7-4523-acfb-70928565fd03-tls-cert\") pod \"authorino-9b67d995-595th\" (UID: \"1e48fc3f-d2e7-4523-acfb-70928565fd03\") " pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.831054 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.831011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9djf\" (UniqueName: \"kubernetes.io/projected/1e48fc3f-d2e7-4523-acfb-70928565fd03-kube-api-access-l9djf\") pod \"authorino-9b67d995-595th\" (UID: \"1e48fc3f-d2e7-4523-acfb-70928565fd03\") " pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.831363 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.831135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1e48fc3f-d2e7-4523-acfb-70928565fd03-tls-cert\") pod \"authorino-9b67d995-595th\" (UID: \"1e48fc3f-d2e7-4523-acfb-70928565fd03\") " pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.833831 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.833805 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1e48fc3f-d2e7-4523-acfb-70928565fd03-tls-cert\") pod \"authorino-9b67d995-595th\" (UID: \"1e48fc3f-d2e7-4523-acfb-70928565fd03\") " pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.839112 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.839085 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9djf\" (UniqueName: \"kubernetes.io/projected/1e48fc3f-d2e7-4523-acfb-70928565fd03-kube-api-access-l9djf\") pod \"authorino-9b67d995-595th\" (UID: \"1e48fc3f-d2e7-4523-acfb-70928565fd03\") " pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:37.940289 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:37.940185 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-9b67d995-595th"
Apr 20 19:34:38.067550 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:38.067525 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-9b67d995-595th"]
Apr 20 19:34:38.069719 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:34:38.069695 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e48fc3f_d2e7_4523_acfb_70928565fd03.slice/crio-a77b97ddc1f1a25a74a5838397c34212f53f5c0fea3ef4cf97ea2f5352b4f832 WatchSource:0}: Error finding container a77b97ddc1f1a25a74a5838397c34212f53f5c0fea3ef4cf97ea2f5352b4f832: Status 404 returned error can't find the container with id a77b97ddc1f1a25a74a5838397c34212f53f5c0fea3ef4cf97ea2f5352b4f832
Apr 20 19:34:38.799331 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:38.799289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-9b67d995-595th" event={"ID":"1e48fc3f-d2e7-4523-acfb-70928565fd03","Type":"ContainerStarted","Data":"469c687fe0ad867078c7888d3cc4206af0cdb6af2d764fa824b471951f0089cd"}
Apr 20 19:34:38.799331 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:38.799335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-9b67d995-595th" event={"ID":"1e48fc3f-d2e7-4523-acfb-70928565fd03","Type":"ContainerStarted","Data":"a77b97ddc1f1a25a74a5838397c34212f53f5c0fea3ef4cf97ea2f5352b4f832"}
Apr 20 19:34:38.816285 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:34:38.816218 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-9b67d995-595th" podStartSLOduration=1.477233024 podStartE2EDuration="1.816201629s" podCreationTimestamp="2026-04-20 19:34:37 +0000 UTC" firstStartedPulling="2026-04-20 19:34:38.071020182 +0000 UTC m=+845.226569981" lastFinishedPulling="2026-04-20 19:34:38.409988772 +0000 UTC m=+845.565538586" observedRunningTime="2026-04-20 19:34:38.813668487 +0000 UTC m=+845.969218327" watchObservedRunningTime="2026-04-20 19:34:38.816201629 +0000 UTC m=+845.971751448"
Apr 20 19:35:08.541075 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:08.541037 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:35:18.343713 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:18.343673 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:35:27.444726 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:27.444645 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:35:33.542353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:33.542313 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log"
Apr 20 19:35:33.543454 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:33.543431 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log"
Apr 20 19:35:33.549443 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:33.549411 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log"
Apr 20 19:35:33.550666 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:33.550644 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log"
Apr 20 19:35:38.641363 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:38.641332 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:35:46.345418 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:46.345380 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:35:57.443479 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:35:57.443430 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:36:59.337871 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:36:59.337790 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:37:14.642088 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:37:14.642050 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:37:52.837528 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:37:52.837482 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:38:09.939768 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:38:09.939729 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:38:24.645943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:38:24.645903 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:38:41.237073 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:38:41.237035 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:39:35.144229 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:39:35.144193 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:39:44.944035 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:39:44.943998 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:40:01.045102 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:01.045004 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:40:09.941510 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:09.941451 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:40:27.338464 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:27.338416 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:40:33.576609 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:33.576580 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log"
Apr 20 19:40:33.579920 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:33.579900 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log"
Apr 20 19:40:33.583285 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:33.583267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log"
Apr 20 19:40:33.586755 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:33.586737 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log"
Apr 20 19:40:35.144958 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:40:35.144923 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:07.357326 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:07.357287 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:15.537095 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:15.537053 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:25.036113 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:25.036037 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:33.349941 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:33.349888 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:41.742692 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:41.742651 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:49.532849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:49.532801 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:54.239023 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:54.238974 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:41:58.236564 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:41:58.236531 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:42:09.339590 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:42:09.339553 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:42:57.142304 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:42:57.142206 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"]
Apr 20 19:43:04.545572 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:04.545531 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:13.847808 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:13.847769 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:23.139917 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:23.139880 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:31.936873 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:31.936833 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:40.743159 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:40.743127 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:48.757720 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:48.757678 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:54.038132 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:54.038093 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:43:58.338787 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:43:58.338752 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:06.341043 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:06.340992 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:15.147960 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:15.147921 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:24.545731 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:24.545641 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:33.443674 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:33.443633 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:42.743952 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:42.743912 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:50.342218 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:50.342179 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:44:59.841102 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:44:59.841061 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:45:07.844690 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:07.844647 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:45:17.637123 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:17.637090 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:45:25.271085 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:25.271048 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:45:33.616604 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:33.616564 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:45:33.621833 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:33.621811 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:45:33.622938 
ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:33.622916 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:45:33.628059 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:45:33.628044 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:46:16.960383 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:16.960345 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"] Apr 20 19:46:16.960912 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:16.960589 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" podUID="92a82465-6c14-4457-b8af-00b32e34335d" containerName="manager" containerID="cri-o://3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6" gracePeriod=10 Apr 20 19:46:17.721091 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.721069 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" Apr 20 19:46:17.728003 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.727976 2580 generic.go:358] "Generic (PLEG): container finished" podID="92a82465-6c14-4457-b8af-00b32e34335d" containerID="3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6" exitCode=0 Apr 20 19:46:17.728131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.728019 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" event={"ID":"92a82465-6c14-4457-b8af-00b32e34335d","Type":"ContainerDied","Data":"3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6"} Apr 20 19:46:17.728131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.728032 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" Apr 20 19:46:17.728131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.728055 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr" event={"ID":"92a82465-6c14-4457-b8af-00b32e34335d","Type":"ContainerDied","Data":"4c4d523bd5597316daae3008de92969635dd5548aea4e2c9ab78181131b412fd"} Apr 20 19:46:17.728131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.728076 2580 scope.go:117] "RemoveContainer" containerID="3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6" Apr 20 19:46:17.742783 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.740395 2580 scope.go:117] "RemoveContainer" containerID="3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6" Apr 20 19:46:17.742783 ip-10-0-134-118 kubenswrapper[2580]: E0420 19:46:17.742016 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6\": container with ID starting with 3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6 not found: ID does not exist" containerID="3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6" Apr 20 19:46:17.742783 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.742051 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6"} err="failed to get container status \"3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6\": rpc error: code = NotFound desc = could not find container \"3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6\": container with ID starting with 3d33d576fcbe538317d99e397289da3ff523c7654ea0e21e7593f6efdb0ec3f6 not found: ID does not exist" Apr 20 19:46:17.839021 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.838996 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92a82465-6c14-4457-b8af-00b32e34335d-extensions-socket-volume\") pod \"92a82465-6c14-4457-b8af-00b32e34335d\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " Apr 20 19:46:17.839188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.839041 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjv7v\" (UniqueName: \"kubernetes.io/projected/92a82465-6c14-4457-b8af-00b32e34335d-kube-api-access-rjv7v\") pod \"92a82465-6c14-4457-b8af-00b32e34335d\" (UID: \"92a82465-6c14-4457-b8af-00b32e34335d\") " Apr 20 19:46:17.839418 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.839396 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92a82465-6c14-4457-b8af-00b32e34335d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod 
"92a82465-6c14-4457-b8af-00b32e34335d" (UID: "92a82465-6c14-4457-b8af-00b32e34335d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:46:17.841211 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.841191 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a82465-6c14-4457-b8af-00b32e34335d-kube-api-access-rjv7v" (OuterVolumeSpecName: "kube-api-access-rjv7v") pod "92a82465-6c14-4457-b8af-00b32e34335d" (UID: "92a82465-6c14-4457-b8af-00b32e34335d"). InnerVolumeSpecName "kube-api-access-rjv7v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:46:17.939741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.939706 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/92a82465-6c14-4457-b8af-00b32e34335d-extensions-socket-volume\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:46:17.939741 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:17.939740 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rjv7v\" (UniqueName: \"kubernetes.io/projected/92a82465-6c14-4457-b8af-00b32e34335d-kube-api-access-rjv7v\") on node \"ip-10-0-134-118.ec2.internal\" DevicePath \"\"" Apr 20 19:46:18.050279 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:18.050210 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"] Apr 20 19:46:18.053370 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:18.053345 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nmfdr"] Apr 20 19:46:19.552745 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:46:19.552708 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a82465-6c14-4457-b8af-00b32e34335d" 
path="/var/lib/kubelet/pods/92a82465-6c14-4457-b8af-00b32e34335d/volumes" Apr 20 19:47:23.112006 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.111923 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4"] Apr 20 19:47:23.112458 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.112386 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92a82465-6c14-4457-b8af-00b32e34335d" containerName="manager" Apr 20 19:47:23.112458 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.112399 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a82465-6c14-4457-b8af-00b32e34335d" containerName="manager" Apr 20 19:47:23.112531 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.112477 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="92a82465-6c14-4457-b8af-00b32e34335d" containerName="manager" Apr 20 19:47:23.115597 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.115580 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.118131 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.118103 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-sxkjt\"" Apr 20 19:47:23.127663 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.127640 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4"] Apr 20 19:47:23.219372 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.219312 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf6l\" (UniqueName: \"kubernetes.io/projected/d83a18a9-71c7-42e7-bfbe-a31a24d79be2-kube-api-access-2zf6l\") pod \"kuadrant-operator-controller-manager-55c7f4c975-hpfl4\" (UID: \"d83a18a9-71c7-42e7-bfbe-a31a24d79be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.219573 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.219536 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d83a18a9-71c7-42e7-bfbe-a31a24d79be2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-hpfl4\" (UID: \"d83a18a9-71c7-42e7-bfbe-a31a24d79be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.320043 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.319991 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d83a18a9-71c7-42e7-bfbe-a31a24d79be2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-hpfl4\" (UID: \"d83a18a9-71c7-42e7-bfbe-a31a24d79be2\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.320043 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.320050 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zf6l\" (UniqueName: \"kubernetes.io/projected/d83a18a9-71c7-42e7-bfbe-a31a24d79be2-kube-api-access-2zf6l\") pod \"kuadrant-operator-controller-manager-55c7f4c975-hpfl4\" (UID: \"d83a18a9-71c7-42e7-bfbe-a31a24d79be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.320465 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.320441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d83a18a9-71c7-42e7-bfbe-a31a24d79be2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-hpfl4\" (UID: \"d83a18a9-71c7-42e7-bfbe-a31a24d79be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.330244 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.330224 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zf6l\" (UniqueName: \"kubernetes.io/projected/d83a18a9-71c7-42e7-bfbe-a31a24d79be2-kube-api-access-2zf6l\") pod \"kuadrant-operator-controller-manager-55c7f4c975-hpfl4\" (UID: \"d83a18a9-71c7-42e7-bfbe-a31a24d79be2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.427995 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.427909 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:23.561588 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.561565 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4"] Apr 20 19:47:23.563321 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:47:23.563289 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd83a18a9_71c7_42e7_bfbe_a31a24d79be2.slice/crio-fced2cc70a5a1590719a6351378cc105c0d03f79f34b4eff43da3c33b70ab844 WatchSource:0}: Error finding container fced2cc70a5a1590719a6351378cc105c0d03f79f34b4eff43da3c33b70ab844: Status 404 returned error can't find the container with id fced2cc70a5a1590719a6351378cc105c0d03f79f34b4eff43da3c33b70ab844 Apr 20 19:47:23.565476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:23.565461 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:47:24.024316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:24.024272 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" event={"ID":"d83a18a9-71c7-42e7-bfbe-a31a24d79be2","Type":"ContainerStarted","Data":"6e42073b160d107dfa79c83662015835bbaa347767c50c23f397e8adbaaf0c20"} Apr 20 19:47:24.024316 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:24.024312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" event={"ID":"d83a18a9-71c7-42e7-bfbe-a31a24d79be2","Type":"ContainerStarted","Data":"fced2cc70a5a1590719a6351378cc105c0d03f79f34b4eff43da3c33b70ab844"} Apr 20 19:47:24.024569 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:24.024334 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:24.048807 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:24.048761 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" podStartSLOduration=1.048748029 podStartE2EDuration="1.048748029s" podCreationTimestamp="2026-04-20 19:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:47:24.04582533 +0000 UTC m=+1611.201375164" watchObservedRunningTime="2026-04-20 19:47:24.048748029 +0000 UTC m=+1611.204297847" Apr 20 19:47:35.031466 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:35.031433 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-hpfl4" Apr 20 19:47:43.849029 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:43.848992 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:47:49.443607 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:47:49.443570 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:48:14.343593 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:48:14.343550 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:48:21.540079 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:48:21.540030 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:48:31.132378 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:48:31.132347 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:48:41.640682 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:48:41.640633 2580 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:48:49.433737 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:48:49.433691 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:49:01.034906 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:49:01.034814 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:49:09.039802 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:49:09.039768 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:49:20.537610 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:49:20.537580 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:49:28.537075 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:49:28.537036 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:49:38.237896 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:49:38.237858 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:49:48.305059 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:49:48.305020 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:50:23.545308 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:50:23.545202 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:50:33.656385 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:50:33.656350 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:50:33.662599 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:50:33.662574 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:50:33.662738 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:50:33.662577 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:50:33.669701 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:50:33.669679 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:51:05.944349 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:51:05.944304 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:51:14.241980 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:51:14.241931 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:51:23.743564 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:51:23.743525 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:51:32.446393 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:51:32.446334 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:51:40.444226 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:51:40.444191 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:51:54.039800 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:51:54.039719 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:52:02.345348 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:52:02.345309 2580 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:52:10.535172 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:52:10.535133 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:52:18.544956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:52:18.544923 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:52:27.040133 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:52:27.040096 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:52:35.832917 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:52:35.832879 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:52:46.444453 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:52:46.444402 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:53:03.261921 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:53:03.261882 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:53:11.537621 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:53:11.537583 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:53:20.543203 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:53:20.543122 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:53:29.144961 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:53:29.144925 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:53:46.737437 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:53:46.737398 2580 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:53:55.155222 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:53:55.155180 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:54:04.438301 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:54:04.438262 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:54:12.543798 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:54:12.543758 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:54:20.546795 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:54:20.546751 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:54:29.249000 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:54:29.248963 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:54:38.953218 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:54:38.953181 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:54:49.148046 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:54:49.148014 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:55:00.042869 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:00.042787 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:55:09.736033 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:09.735992 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:55:19.145435 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:19.145401 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:55:28.139665 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:28.139623 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:55:33.693096 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:33.693065 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:55:33.700518 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:33.700492 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:55:33.701009 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:33.700985 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:55:33.707893 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:33.707871 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:55:35.942498 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:35.942457 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:55:44.338667 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:55:44.338624 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:56:00.041143 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:56:00.041103 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:56:08.539793 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:56:08.539757 2580 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:56:18.145272 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:56:18.138726 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:56:25.738142 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:56:25.738051 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:56:50.538036 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:56:50.538004 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:57:02.041262 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:02.041223 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-ql8jb"] Apr 20 19:57:04.114388 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:04.114351 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-9b67d995-595th_1e48fc3f-d2e7-4523-acfb-70928565fd03/authorino/0.log" Apr 20 19:57:08.940940 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:08.940908 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7875d57869-w7znh_ff798f5c-331d-412b-8748-5adc81c3d101/manager/0.log" Apr 20 19:57:09.831747 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:09.831687 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js_85d30de4-f22c-4375-acb1-005a6556895f/extract/0.log" Apr 20 19:57:09.838604 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:09.838579 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js_85d30de4-f22c-4375-acb1-005a6556895f/util/0.log" Apr 20 19:57:09.845210 ip-10-0-134-118 kubenswrapper[2580]: I0420 
19:57:09.845188 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js_85d30de4-f22c-4375-acb1-005a6556895f/pull/0.log" Apr 20 19:57:09.962053 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:09.962023 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn_eb1ca523-7b5c-48c1-91cf-66352befe38a/util/0.log" Apr 20 19:57:09.969169 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:09.969148 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn_eb1ca523-7b5c-48c1-91cf-66352befe38a/pull/0.log" Apr 20 19:57:09.976654 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:09.976635 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn_eb1ca523-7b5c-48c1-91cf-66352befe38a/extract/0.log" Apr 20 19:57:10.094884 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.094803 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2_40afe44c-fd24-405c-9d93-1ddd9db818d2/util/0.log" Apr 20 19:57:10.101421 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.101400 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2_40afe44c-fd24-405c-9d93-1ddd9db818d2/pull/0.log" Apr 20 19:57:10.108101 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.108081 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2_40afe44c-fd24-405c-9d93-1ddd9db818d2/extract/0.log" Apr 20 19:57:10.221234 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.221205 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj_0f2dbce2-0997-48f9-b99e-7f49e643677c/util/0.log" Apr 20 19:57:10.227667 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.227647 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj_0f2dbce2-0997-48f9-b99e-7f49e643677c/pull/0.log" Apr 20 19:57:10.233704 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.233684 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj_0f2dbce2-0997-48f9-b99e-7f49e643677c/extract/0.log" Apr 20 19:57:10.353313 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.353224 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-9b67d995-595th_1e48fc3f-d2e7-4523-acfb-70928565fd03/authorino/0.log" Apr 20 19:57:10.847827 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.847795 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nlc75_ee9a5352-9f0f-4dca-b13a-dd593b551f2c/registry-server/0.log" Apr 20 19:57:10.970794 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:10.970764 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-hpfl4_d83a18a9-71c7-42e7-bfbe-a31a24d79be2/manager/0.log" Apr 20 19:57:11.087588 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:11.087557 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-ql8jb_3c46ebc3-6578-4228-b8d4-8cb3c18b5038/limitador/0.log" Apr 20 19:57:11.565026 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:11.564998 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557ffqnwz_472cf0ba-cb87-451a-8cce-616a43e88e47/istio-proxy/0.log" Apr 20 19:57:12.038230 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:12.038182 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-cqwtl_48fb4579-df4a-4fbf-9df3-2833945580ae/istio-proxy/0.log" Apr 20 19:57:12.163835 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:12.163798 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76f79b8cc6-4kmk9_b2caeaea-f388-4a16-a139-404c07f66f1e/router/0.log" Apr 20 19:57:13.004101 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:13.004070 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8_aade7fe2-9b25-42bb-8645-888ef558ba10/storage-initializer/0.log" Apr 20 19:57:13.017556 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:13.017532 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-fhwh8_aade7fe2-9b25-42bb-8645-888ef558ba10/main/0.log" Apr 20 19:57:17.628243 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.628211 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5pdh/must-gather-gk7pc"] Apr 20 19:57:17.632138 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.632122 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.634286 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.634235 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5pdh\"/\"default-dockercfg-cwh22\"" Apr 20 19:57:17.634627 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.634605 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5pdh\"/\"kube-root-ca.crt\"" Apr 20 19:57:17.634749 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.634607 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5pdh\"/\"openshift-service-ca.crt\"" Apr 20 19:57:17.647922 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.647899 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/must-gather-gk7pc"] Apr 20 19:57:17.700738 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.700703 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x64lf\" (UniqueName: \"kubernetes.io/projected/33f01982-110f-4db6-93ea-d126a5e35f15-kube-api-access-x64lf\") pod \"must-gather-gk7pc\" (UID: \"33f01982-110f-4db6-93ea-d126a5e35f15\") " pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.700930 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.700791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33f01982-110f-4db6-93ea-d126a5e35f15-must-gather-output\") pod \"must-gather-gk7pc\" (UID: \"33f01982-110f-4db6-93ea-d126a5e35f15\") " pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.801405 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.801374 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/33f01982-110f-4db6-93ea-d126a5e35f15-must-gather-output\") pod \"must-gather-gk7pc\" (UID: \"33f01982-110f-4db6-93ea-d126a5e35f15\") " pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.801575 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.801469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x64lf\" (UniqueName: \"kubernetes.io/projected/33f01982-110f-4db6-93ea-d126a5e35f15-kube-api-access-x64lf\") pod \"must-gather-gk7pc\" (UID: \"33f01982-110f-4db6-93ea-d126a5e35f15\") " pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.801701 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.801682 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/33f01982-110f-4db6-93ea-d126a5e35f15-must-gather-output\") pod \"must-gather-gk7pc\" (UID: \"33f01982-110f-4db6-93ea-d126a5e35f15\") " pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.809293 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.809229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x64lf\" (UniqueName: \"kubernetes.io/projected/33f01982-110f-4db6-93ea-d126a5e35f15-kube-api-access-x64lf\") pod \"must-gather-gk7pc\" (UID: \"33f01982-110f-4db6-93ea-d126a5e35f15\") " pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:17.942471 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:17.942382 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5pdh/must-gather-gk7pc" Apr 20 19:57:18.082001 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:18.081970 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/must-gather-gk7pc"] Apr 20 19:57:18.083603 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:57:18.083579 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33f01982_110f_4db6_93ea_d126a5e35f15.slice/crio-47e85de780d361fcc633e5afad162672752a285e198b49a399e3fbdef6f4194d WatchSource:0}: Error finding container 47e85de780d361fcc633e5afad162672752a285e198b49a399e3fbdef6f4194d: Status 404 returned error can't find the container with id 47e85de780d361fcc633e5afad162672752a285e198b49a399e3fbdef6f4194d Apr 20 19:57:18.085482 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:18.085467 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:57:18.540038 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:18.539998 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/must-gather-gk7pc" event={"ID":"33f01982-110f-4db6-93ea-d126a5e35f15","Type":"ContainerStarted","Data":"47e85de780d361fcc633e5afad162672752a285e198b49a399e3fbdef6f4194d"} Apr 20 19:57:19.546551 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:19.546517 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/must-gather-gk7pc" event={"ID":"33f01982-110f-4db6-93ea-d126a5e35f15","Type":"ContainerStarted","Data":"ba83ae96d7f8fd69146072ec284269c2684d8ec823a51233312930326af940fe"} Apr 20 19:57:19.546551 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:19.546555 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/must-gather-gk7pc" 
event={"ID":"33f01982-110f-4db6-93ea-d126a5e35f15","Type":"ContainerStarted","Data":"08bb638704f042dc58105564536034b8dde9178b468dfedbab003cfe31f3eda5"} Apr 20 19:57:19.563326 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:19.563260 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5pdh/must-gather-gk7pc" podStartSLOduration=1.5729707 podStartE2EDuration="2.563229397s" podCreationTimestamp="2026-04-20 19:57:17 +0000 UTC" firstStartedPulling="2026-04-20 19:57:18.08558931 +0000 UTC m=+2205.241139110" lastFinishedPulling="2026-04-20 19:57:19.075848008 +0000 UTC m=+2206.231397807" observedRunningTime="2026-04-20 19:57:19.560697721 +0000 UTC m=+2206.716247541" watchObservedRunningTime="2026-04-20 19:57:19.563229397 +0000 UTC m=+2206.718779217" Apr 20 19:57:20.706437 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:20.706402 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-frnjg_1e64ae2b-e7d8-473c-9c0c-c27208349a5c/global-pull-secret-syncer/0.log" Apr 20 19:57:20.846518 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:20.846484 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z6jmk_f817bf27-7a92-45a2-acd6-3f63fdbb765d/konnectivity-agent/0.log" Apr 20 19:57:20.904199 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:20.904170 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-118.ec2.internal_eb80aa0e1c4282d53926185e2c1c5fe9/haproxy/0.log" Apr 20 19:57:25.426404 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.426363 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js_85d30de4-f22c-4375-acb1-005a6556895f/extract/0.log" Apr 20 19:57:25.448391 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.448304 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js_85d30de4-f22c-4375-acb1-005a6556895f/util/0.log" Apr 20 19:57:25.468857 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.468828 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759458js_85d30de4-f22c-4375-acb1-005a6556895f/pull/0.log" Apr 20 19:57:25.496342 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.496307 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn_eb1ca523-7b5c-48c1-91cf-66352befe38a/extract/0.log" Apr 20 19:57:25.521981 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.521952 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn_eb1ca523-7b5c-48c1-91cf-66352befe38a/util/0.log" Apr 20 19:57:25.549044 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.549012 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0dwbvn_eb1ca523-7b5c-48c1-91cf-66352befe38a/pull/0.log" Apr 20 19:57:25.574913 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.574871 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2_40afe44c-fd24-405c-9d93-1ddd9db818d2/extract/0.log" Apr 20 19:57:25.600735 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.600701 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2_40afe44c-fd24-405c-9d93-1ddd9db818d2/util/0.log" Apr 20 19:57:25.619881 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.619827 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73cz9p2_40afe44c-fd24-405c-9d93-1ddd9db818d2/pull/0.log" Apr 20 19:57:25.646187 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.646156 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj_0f2dbce2-0997-48f9-b99e-7f49e643677c/extract/0.log" Apr 20 19:57:25.666491 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.666455 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj_0f2dbce2-0997-48f9-b99e-7f49e643677c/util/0.log" Apr 20 19:57:25.685996 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.685917 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1t44tj_0f2dbce2-0997-48f9-b99e-7f49e643677c/pull/0.log" Apr 20 19:57:25.953188 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:25.952161 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-9b67d995-595th_1e48fc3f-d2e7-4523-acfb-70928565fd03/authorino/0.log" Apr 20 19:57:26.075476 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:26.075440 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nlc75_ee9a5352-9f0f-4dca-b13a-dd593b551f2c/registry-server/0.log" Apr 20 19:57:26.142705 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:26.142673 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-hpfl4_d83a18a9-71c7-42e7-bfbe-a31a24d79be2/manager/0.log" Apr 20 19:57:26.162025 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:26.162000 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-ql8jb_3c46ebc3-6578-4228-b8d4-8cb3c18b5038/limitador/0.log" Apr 20 
19:57:27.794531 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.794492 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/alertmanager/0.log" Apr 20 19:57:27.814660 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.814626 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/config-reloader/0.log" Apr 20 19:57:27.834548 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.834521 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/kube-rbac-proxy-web/0.log" Apr 20 19:57:27.854549 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.854523 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/kube-rbac-proxy/0.log" Apr 20 19:57:27.873585 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.873557 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/kube-rbac-proxy-metric/0.log" Apr 20 19:57:27.896155 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.896124 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/prom-label-proxy/0.log" Apr 20 19:57:27.915053 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.914973 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_6fed37ca-f706-4cae-9747-83d43a58e7a7/init-config-reloader/0.log" Apr 20 19:57:27.991277 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:27.991227 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f8lth_ff99f872-5830-475e-b0ab-2019f63a53c2/kube-state-metrics/0.log" Apr 20 19:57:28.012370 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.012341 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f8lth_ff99f872-5830-475e-b0ab-2019f63a53c2/kube-rbac-proxy-main/0.log" Apr 20 19:57:28.031453 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.031417 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-f8lth_ff99f872-5830-475e-b0ab-2019f63a53c2/kube-rbac-proxy-self/0.log" Apr 20 19:57:28.176086 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.175999 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqt9s_d41e64ff-5632-4dbf-9594-58ad2cd1ccc5/node-exporter/0.log" Apr 20 19:57:28.193618 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.193557 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqt9s_d41e64ff-5632-4dbf-9594-58ad2cd1ccc5/kube-rbac-proxy/0.log" Apr 20 19:57:28.211647 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.211619 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nqt9s_d41e64ff-5632-4dbf-9594-58ad2cd1ccc5/init-textfile/0.log" Apr 20 19:57:28.291681 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.291647 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p8t2c_3f387372-bc50-4d3b-b439-725017d71a0f/kube-rbac-proxy-main/0.log" Apr 20 19:57:28.311016 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.310989 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p8t2c_3f387372-bc50-4d3b-b439-725017d71a0f/kube-rbac-proxy-self/0.log" Apr 20 19:57:28.330559 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:57:28.330518 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-p8t2c_3f387372-bc50-4d3b-b439-725017d71a0f/openshift-state-metrics/0.log" Apr 20 19:57:28.553432 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.553396 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-fdl24_fdde5ed3-13c4-4768-8526-e3485db975eb/prometheus-operator-admission-webhook/0.log" Apr 20 19:57:28.581897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.581860 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79df64d499-d24hs_4adea75c-b9b0-4da9-a5b9-1458233cf095/telemeter-client/0.log" Apr 20 19:57:28.600085 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.600046 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79df64d499-d24hs_4adea75c-b9b0-4da9-a5b9-1458233cf095/reload/0.log" Apr 20 19:57:28.623161 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.623130 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-79df64d499-d24hs_4adea75c-b9b0-4da9-a5b9-1458233cf095/kube-rbac-proxy/0.log" Apr 20 19:57:28.653749 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.653720 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/thanos-query/0.log" Apr 20 19:57:28.671217 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.671193 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy-web/0.log" Apr 20 19:57:28.688813 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.688786 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy/0.log" Apr 20 19:57:28.711971 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.711941 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/prom-label-proxy/0.log" Apr 20 19:57:28.740119 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.740083 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy-rules/0.log" Apr 20 19:57:28.754025 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:28.753997 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c4bcd9dc5-m4dj4_dd66325b-c9cf-4338-988a-2be2df804b9b/kube-rbac-proxy-metrics/0.log" Apr 20 19:57:29.569709 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.569671 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g"] Apr 20 19:57:29.576809 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.576779 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.581943 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.581902 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g"] Apr 20 19:57:29.731690 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.731644 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-lib-modules\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.731897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.731709 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-sys\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.731897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.731744 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m27q9\" (UniqueName: \"kubernetes.io/projected/4096d941-7a6e-4fc2-9919-ab375b71a332-kube-api-access-m27q9\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.731897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.731801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-proc\") pod \"perf-node-gather-daemonset-fck4g\" (UID: 
\"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.731897 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.731866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-podres\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.800300 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.800271 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-76ngm_3fa1e505-222b-4d26-b6c6-b500bff9d597/networking-console-plugin/0.log" Apr 20 19:57:29.832599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.832501 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-lib-modules\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.832599 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.832568 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-sys\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.832849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.832604 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m27q9\" (UniqueName: \"kubernetes.io/projected/4096d941-7a6e-4fc2-9919-ab375b71a332-kube-api-access-m27q9\") pod 
\"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.832849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.832686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-proc\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.832849 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.832743 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-podres\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.833020 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.832996 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-podres\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.833123 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.833100 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-lib-modules\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.833195 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.833161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-sys\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.833605 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.833580 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4096d941-7a6e-4fc2-9919-ab375b71a332-proc\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.841381 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.841350 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m27q9\" (UniqueName: \"kubernetes.io/projected/4096d941-7a6e-4fc2-9919-ab375b71a332-kube-api-access-m27q9\") pod \"perf-node-gather-daemonset-fck4g\" (UID: \"4096d941-7a6e-4fc2-9919-ab375b71a332\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:29.896353 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:29.896308 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:30.060060 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.060030 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g"] Apr 20 19:57:30.061130 ip-10-0-134-118 kubenswrapper[2580]: W0420 19:57:30.061102 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4096d941_7a6e_4fc2_9919_ab375b71a332.slice/crio-a199e3609f6850a8dc3d0b53cb20776930946c25e6e322df28028739f9b19556 WatchSource:0}: Error finding container a199e3609f6850a8dc3d0b53cb20776930946c25e6e322df28028739f9b19556: Status 404 returned error can't find the container with id a199e3609f6850a8dc3d0b53cb20776930946c25e6e322df28028739f9b19556 Apr 20 19:57:30.344521 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.344434 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/2.log" Apr 20 19:57:30.349643 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.349620 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xl6pd_c94687d9-038f-401c-95c1-1a65b578340b/console-operator/3.log" Apr 20 19:57:30.656494 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.656395 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" event={"ID":"4096d941-7a6e-4fc2-9919-ab375b71a332","Type":"ContainerStarted","Data":"ed45907348ee0c2d6026a9bb572f16e5d97e7cc42cff2bbcc722d407aa76f9a5"} Apr 20 19:57:30.656494 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.656444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" 
event={"ID":"4096d941-7a6e-4fc2-9919-ab375b71a332","Type":"ContainerStarted","Data":"a199e3609f6850a8dc3d0b53cb20776930946c25e6e322df28028739f9b19556"} Apr 20 19:57:30.657022 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.656525 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:30.671926 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.671882 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" podStartSLOduration=1.671867049 podStartE2EDuration="1.671867049s" podCreationTimestamp="2026-04-20 19:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:57:30.669518553 +0000 UTC m=+2217.825068384" watchObservedRunningTime="2026-04-20 19:57:30.671867049 +0000 UTC m=+2217.827416868" Apr 20 19:57:30.815665 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.815630 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d8566fcc-8d2rq_549fb4c6-a797-44cc-ad04-c044daa57e7f/console/0.log" Apr 20 19:57:30.865271 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:30.865229 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-gctks_dfd03727-181d-4602-92a1-1407031aec92/download-server/0.log" Apr 20 19:57:31.318200 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:31.318171 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zb2mk_92b63e56-5288-415d-a75a-4c18d45ecf72/volume-data-source-validator/0.log" Apr 20 19:57:32.016036 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:32.016008 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-h2wph_593cc9e9-b499-4686-897a-e1b604685e20/dns/0.log" Apr 20 19:57:32.034954 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:32.034933 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-h2wph_593cc9e9-b499-4686-897a-e1b604685e20/kube-rbac-proxy/0.log" Apr 20 19:57:32.164835 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:32.164811 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-78xk2_ab0118c6-b394-43bc-bf6d-fb41c2beb9d1/dns-node-resolver/0.log" Apr 20 19:57:32.666956 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:32.666929 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fkcml_43ca0838-f833-4608-bf4f-d6f498c3c609/node-ca/0.log" Apr 20 19:57:33.498983 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:33.498954 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557ffqnwz_472cf0ba-cb87-451a-8cce-616a43e88e47/istio-proxy/0.log" Apr 20 19:57:33.739430 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:33.739395 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-cqwtl_48fb4579-df4a-4fbf-9df3-2833945580ae/istio-proxy/0.log" Apr 20 19:57:33.770908 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:33.770883 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-76f79b8cc6-4kmk9_b2caeaea-f388-4a16-a139-404c07f66f1e/router/0.log" Apr 20 19:57:34.348638 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:34.348604 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8q97z_b7b97f68-52bc-4ef7-81c1-cbf2a7da14a2/serve-healthcheck-canary/0.log" Apr 20 19:57:34.872990 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:34.872961 2580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6xz5p_352db113-3b67-4009-a8e0-926d28c4af22/kube-rbac-proxy/0.log" Apr 20 19:57:34.890662 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:34.890632 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6xz5p_352db113-3b67-4009-a8e0-926d28c4af22/exporter/0.log" Apr 20 19:57:34.909378 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:34.909349 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6xz5p_352db113-3b67-4009-a8e0-926d28c4af22/extractor/0.log" Apr 20 19:57:36.671877 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:36.671845 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-fck4g" Apr 20 19:57:37.088486 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:37.088445 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7875d57869-w7znh_ff798f5c-331d-412b-8748-5adc81c3d101/manager/0.log" Apr 20 19:57:38.356910 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:38.356821 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5c6db948fd-794jn_a2827786-8b73-49ef-92e2-8988ac55b679/manager/0.log" Apr 20 19:57:44.542245 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.542217 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/kube-multus-additional-cni-plugins/0.log" Apr 20 19:57:44.562169 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.562136 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/egress-router-binary-copy/0.log" Apr 20 19:57:44.582113 ip-10-0-134-118 
kubenswrapper[2580]: I0420 19:57:44.582087 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/cni-plugins/0.log" Apr 20 19:57:44.602659 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.602632 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/bond-cni-plugin/0.log" Apr 20 19:57:44.620606 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.620578 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/routeoverride-cni/0.log" Apr 20 19:57:44.640290 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.640239 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/whereabouts-cni-bincopy/0.log" Apr 20 19:57:44.658113 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.658082 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-l87ws_58118261-be4f-4f34-96ae-d918e3128ec4/whereabouts-cni/0.log" Apr 20 19:57:44.692903 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.692872 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bcggv_cb98fcc4-ee47-45ed-bae3-05703748d0df/kube-multus/0.log" Apr 20 19:57:44.737920 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.737892 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mw5qh_a8ada6b3-5038-4d1c-bbe5-a9626c8c1987/network-metrics-daemon/0.log" Apr 20 19:57:44.753945 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:44.753921 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-mw5qh_a8ada6b3-5038-4d1c-bbe5-a9626c8c1987/kube-rbac-proxy/0.log" Apr 20 19:57:45.671113 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.671082 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-controller/0.log" Apr 20 19:57:45.685511 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.685442 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/0.log" Apr 20 19:57:45.698913 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.698886 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovn-acl-logging/1.log" Apr 20 19:57:45.716129 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.716100 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/kube-rbac-proxy-node/0.log" Apr 20 19:57:45.733876 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.733851 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 19:57:45.747937 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.747916 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/northd/0.log" Apr 20 19:57:45.765516 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.765491 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/nbdb/0.log" Apr 20 19:57:45.782565 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.782546 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/sbdb/0.log" Apr 20 19:57:45.919797 ip-10-0-134-118 kubenswrapper[2580]: I0420 19:57:45.919747 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d9tnf_9ab31a48-0dcf-4d61-94cd-b05c680c9b49/ovnkube-controller/0.log"