Apr 17 21:11:23.726144 ip-10-0-134-198 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 21:11:23.726155 ip-10-0-134-198 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 21:11:23.726163 ip-10-0-134-198 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 21:11:23.726368 ip-10-0-134-198 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 21:11:33.856392 ip-10-0-134-198 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 21:11:33.856410 ip-10-0-134-198 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 32a79b0692d447d8a5c89fcb30139752 --
Apr 17 21:14:04.302800 ip-10-0-134-198 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 21:14:04.793342 ip-10-0-134-198 kubenswrapper[2567]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:14:04.793342 ip-10-0-134-198 kubenswrapper[2567]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 21:14:04.793342 ip-10-0-134-198 kubenswrapper[2567]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:14:04.793342 ip-10-0-134-198 kubenswrapper[2567]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
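The first boot above fails before the kubelet binary ever runs: systemd cannot read the unit's environment file, the 'start-pre' step therefore cannot execute, and the restart loop aborts because crio.service is not yet present. A minimal sketch of a unit with these dependencies follows; the file paths and commands are hypothetical (the unit actually shipped on this node will differ), and the leading `-` on `EnvironmentFile=` is what tells systemd to tolerate a missing file instead of failing with result 'resources':

```ini
# Illustrative kubelet.service sketch only -- paths and commands are
# assumptions, not the unit deployed on this node.
[Unit]
Description=Kubernetes Kubelet
# Restart scheduling in the log failed because this unit was absent:
Wants=crio.service
After=crio.service

[Service]
# Without the leading "-", a missing file aborts startup with
# "Failed to load environment files" and result 'resources'.
EnvironmentFile=-/etc/kubernetes/kubelet-env
ExecStartPre=/bin/mkdir -p /var/lib/kubelet
ExecStart=/usr/bin/kubelet $KUBELET_FLAGS
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```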
Apr 17 21:14:04.793342 ip-10-0-134-198 kubenswrapper[2567]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:14:04.794497 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.794078 2567 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 21:14:04.800389 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800369 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:14:04.800389 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800388 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:14:04.800389 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800392 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800395 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800398 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800403 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800406 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800409 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800412 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800414 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800418 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800421 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800427 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800430 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800433 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800437 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800441 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800443 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800446 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800448 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800451 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800455 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800459 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800462 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800465 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800467 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800470 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800473 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800475 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800478 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800480 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800483 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800486 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800489 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800491 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800494 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800497 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800500 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800503 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800505 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800508 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:14:04.800987 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800511 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800513 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800528 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800532 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800536 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800541 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800546 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800548 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800551 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800553 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800556 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800558 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800561 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800564 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800567 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800571 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800573 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800576 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800579 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:14:04.801480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800581 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800584 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800587 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800589 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800592 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800594 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800597 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800600 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800602 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800605 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800608 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800611 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800614 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800616 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800619 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800622 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800625 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800627 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800630 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800633 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:14:04.801959 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800635 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800638 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800640 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800643 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800651 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800653 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801064 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801069 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801073 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801075 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801078 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801081 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801084 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801087 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801090 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801092 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801095 2567 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801097 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801100 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801103 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801105 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801108 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801110 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801113 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801116 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801120 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801123 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801126 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801128 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801131 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801134 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801136 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801139 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801141 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801144 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801147 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801150 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801152 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801155 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:14:04.802936 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801160 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801162 2567 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801165 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801168 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801170 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801173 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801175 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801178 2567 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801181 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801183 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801186 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801188 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801191 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801193 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801196 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801198 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801201 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801204 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801206 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801209 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:14:04.803443 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801212 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801214 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801217 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801219 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801222 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801224 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801227 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801229 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801232 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801234 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801236 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801240 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801243 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801245 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801248 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801251 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801254 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801256 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801267 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801270 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:14:04.803948 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801274 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
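The wall of "unrecognized feature gate" warnings above repeats the same set of gate names twice (the kubelet logs them once per feature-gate source). When triaging a capture like this, it can help to reduce the noise to the unique gate names. A small sketch, assuming the journal lines are available as strings (the sample entries below are abbreviated from the log above):

```python
import re

# Sample journal lines, taken from the capture above; the real input
# would be the full journalctl output.
lines = [
    'Apr 17 21:14:04.800389 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800369 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig',
    'Apr 17 21:14:04.802438 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801064 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud',
    'Apr 17 21:14:04.800480 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.800446 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud',
]

pattern = re.compile(r'unrecognized feature gate: (\S+)')

def unique_gates(lines):
    """Return the sorted set of gate names flagged as unrecognized."""
    return sorted({m.group(1) for line in lines if (m := pattern.search(line))})

print(unique_gates(lines))
```

On the sample lines this collapses the duplicate `ClusterAPIInstallIBMCloud` warnings to a single entry; running it over the whole journal output would yield one line per distinct gate.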
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801278 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801281 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801285 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801287 2567 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801290 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801293 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801295 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801298 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801300 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801304 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801307 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.801310 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802953 2567 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802963 2567 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802969 2567 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802974 2567 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802979 2567 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802982 2567 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802987 2567 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802992 2567 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 21:14:04.804434 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802996 2567 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.802999 2567 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803003 2567 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803008 2567 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803011 2567 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803014 2567 flags.go:64] FLAG: --cgroup-root=""
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803017 2567 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803021 2567 flags.go:64] FLAG: --client-ca-file=""
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803024 2567 flags.go:64] FLAG: --cloud-config=""
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803027 2567 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803030 2567 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803035 2567 flags.go:64] FLAG: --cluster-domain=""
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803038 2567 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803042 2567 flags.go:64] FLAG: --config-dir=""
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803044 2567 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803048 2567 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803053 2567 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803056 2567 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803060 2567 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803063 2567 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803067 2567 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803070 2567 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803073 2567 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803077 2567 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803080 2567 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 21:14:04.804953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803084 2567 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803088 2567 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803091 2567 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803094 2567 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803097 2567 flags.go:64] FLAG: --enable-server="true"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803100 2567 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803104 2567 flags.go:64] FLAG: --event-burst="100"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803107 2567 flags.go:64] FLAG: --event-qps="50"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803110 2567 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803114 2567 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803118 2567 flags.go:64] FLAG: --eviction-hard=""
Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803122 2567 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803125 2567 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803128 2567 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803132 2567 flags.go:64] FLAG: --eviction-soft="" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803135 2567 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803138 2567 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803141 2567 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803144 2567 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803147 2567 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803150 2567 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803153 2567 flags.go:64] FLAG: --feature-gates="" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803157 2567 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803160 2567 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803163 2567 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 21:14:04.805578 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:14:04.803166 2567 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803169 2567 flags.go:64] FLAG: --healthz-port="10248" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803173 2567 flags.go:64] FLAG: --help="false" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803175 2567 flags.go:64] FLAG: --hostname-override="ip-10-0-134-198.ec2.internal" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803179 2567 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803182 2567 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803185 2567 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803189 2567 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803192 2567 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803195 2567 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803198 2567 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803201 2567 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803204 2567 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803207 2567 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803210 2567 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803213 2567 flags.go:64] FLAG: --kube-reserved="" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803216 2567 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803220 2567 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803224 2567 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803226 2567 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803229 2567 flags.go:64] FLAG: --lock-file="" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803232 2567 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803235 2567 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803239 2567 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 21:14:04.806179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803244 2567 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803247 2567 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803250 2567 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803253 2567 flags.go:64] FLAG: --logging-format="text" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: 
I0417 21:14:04.803256 2567 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803259 2567 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803263 2567 flags.go:64] FLAG: --manifest-url="" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803265 2567 flags.go:64] FLAG: --manifest-url-header="" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803270 2567 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803273 2567 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803277 2567 flags.go:64] FLAG: --max-pods="110" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803280 2567 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803284 2567 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803287 2567 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803291 2567 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803294 2567 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803297 2567 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803300 2567 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:14:04.803307 2567 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803311 2567 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803314 2567 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803317 2567 flags.go:64] FLAG: --pod-cidr="" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803320 2567 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 21:14:04.806772 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803327 2567 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803330 2567 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803334 2567 flags.go:64] FLAG: --pods-per-core="0" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803337 2567 flags.go:64] FLAG: --port="10250" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803340 2567 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803343 2567 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-045c675b8503b5928" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803346 2567 flags.go:64] FLAG: --qos-reserved="" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803349 2567 flags.go:64] FLAG: --read-only-port="10255" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803352 2567 flags.go:64] FLAG: --register-node="true" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803355 2567 
flags.go:64] FLAG: --register-schedulable="true" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803358 2567 flags.go:64] FLAG: --register-with-taints="" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803362 2567 flags.go:64] FLAG: --registry-burst="10" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803365 2567 flags.go:64] FLAG: --registry-qps="5" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803367 2567 flags.go:64] FLAG: --reserved-cpus="" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803370 2567 flags.go:64] FLAG: --reserved-memory="" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803375 2567 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803378 2567 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803381 2567 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803384 2567 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803387 2567 flags.go:64] FLAG: --runonce="false" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803390 2567 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803393 2567 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803396 2567 flags.go:64] FLAG: --seccomp-default="false" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803398 2567 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:14:04.803402 2567 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803405 2567 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 21:14:04.807324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803408 2567 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803411 2567 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803414 2567 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803417 2567 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803420 2567 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803423 2567 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803426 2567 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803429 2567 flags.go:64] FLAG: --system-cgroups="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803432 2567 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803438 2567 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803441 2567 flags.go:64] FLAG: --tls-cert-file="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803445 2567 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803450 2567 flags.go:64] FLAG: 
--tls-min-version="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803453 2567 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803456 2567 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803460 2567 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803463 2567 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803466 2567 flags.go:64] FLAG: --v="2" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803470 2567 flags.go:64] FLAG: --version="false" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803474 2567 flags.go:64] FLAG: --vmodule="" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803479 2567 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.803483 2567 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803596 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803601 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:14:04.807968 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803605 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803608 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803611 2567 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImages Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803613 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803616 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803619 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803622 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803625 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803627 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803630 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803632 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803635 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803637 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803640 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803643 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803646 2567 
feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803648 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803651 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803653 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803656 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:14:04.808580 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803658 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803661 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803663 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803666 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803668 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803671 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803674 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803677 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:14:04.809107 ip-10-0-134-198 
kubenswrapper[2567]: W0417 21:14:04.803679 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803682 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803684 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803687 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803689 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803693 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803698 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803702 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803706 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803712 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803715 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:14:04.809107 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803718 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803721 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803724 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803727 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803730 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803732 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803735 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803738 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803741 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803744 2567 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803747 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803750 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803753 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803755 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803758 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803760 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803763 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803765 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803768 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803770 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:14:04.809601 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803773 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803775 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 
21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803778 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803780 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803783 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803785 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803788 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803791 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803793 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803796 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803799 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803802 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803805 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803808 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803811 2567 feature_gate.go:328] unrecognized 
feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803814 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803816 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803819 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803821 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803824 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:14:04.810089 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803830 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:14:04.810632 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803832 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:14:04.810632 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803835 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:14:04.810632 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803838 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:14:04.810632 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.803840 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:14:04.810632 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.804646 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:14:04.812472 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.812447 2567 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 21:14:04.812472 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.812467 2567 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812533 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812540 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812544 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812548 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812551 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812554 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812557 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812560 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812563 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA 
Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812565 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812568 2567 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812571 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812573 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812576 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812579 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812581 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812584 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812587 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812589 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:14:04.812616 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812592 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812594 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 
21:14:04.812597 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812599 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812603 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812608 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812611 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812616 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812619 2567 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812622 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812625 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812628 2567 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812631 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812634 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812637 2567 
feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812639 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812642 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812644 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812647 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:14:04.813121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812649 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812652 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812654 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812657 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812659 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812662 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812665 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812667 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:14:04.813614 ip-10-0-134-198 
kubenswrapper[2567]: W0417 21:14:04.812670 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812672 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812675 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812678 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812681 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812683 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812686 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812690 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812692 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812695 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812697 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812700 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:14:04.813614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812703 2567 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812705 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812708 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812711 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812713 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812716 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812718 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812721 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812724 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812726 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812729 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812731 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812734 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: 
W0417 21:14:04.812736 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812739 2567 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812741 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812744 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812746 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812750 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812752 2567 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:14:04.814091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812755 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812757 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812760 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812762 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812765 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812768 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 
21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812770 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812774 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.812779 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812895 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812900 2567 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812904 2567 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812907 2567 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812909 2567 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812912 2567 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:14:04.814629 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812915 2567 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812918 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812920 2567 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812923 2567 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812925 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812928 2567 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812931 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812934 2567 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812936 2567 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812939 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812941 2567 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812944 2567 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812946 2567 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812949 2567 
feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812952 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812954 2567 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812957 2567 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812960 2567 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812962 2567 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812965 2567 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:14:04.815003 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812968 2567 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812970 2567 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812972 2567 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812975 2567 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812978 2567 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812981 2567 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 
21:14:04.812984 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812986 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812990 2567 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812992 2567 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812995 2567 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.812997 2567 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813000 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813002 2567 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813005 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813008 2567 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813011 2567 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813013 2567 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813015 2567 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813018 2567 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:14:04.815491 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813021 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813023 2567 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813026 2567 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813030 2567 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813034 2567 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813037 2567 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813040 2567 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813042 2567 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813045 2567 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813048 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813051 2567 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:14:04.816074 
ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813053 2567 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813056 2567 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813058 2567 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813061 2567 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813063 2567 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813067 2567 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813069 2567 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813072 2567 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:14:04.816074 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813074 2567 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813077 2567 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813079 2567 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813082 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813085 2567 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813087 2567 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813090 2567 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813092 2567 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813095 2567 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813098 2567 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813102 2567 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813105 2567 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813107 2567 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813110 2567 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813112 2567 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813114 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813117 2567 feature_gate.go:328] unrecognized feature gate: 
DualReplica Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813119 2567 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813122 2567 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813125 2567 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:14:04.816538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:04.813128 2567 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:14:04.817100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.813132 2567 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 21:14:04.817100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.814010 2567 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 21:14:04.817453 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.817437 2567 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 21:14:04.818600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.818588 2567 server.go:1019] "Starting client certificate rotation" Apr 17 21:14:04.818701 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.818684 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 21:14:04.818737 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:14:04.818721 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 21:14:04.847874 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.847855 2567 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 21:14:04.849732 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.849713 2567 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 21:14:04.862736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.862713 2567 log.go:25] "Validated CRI v1 runtime API" Apr 17 21:14:04.871469 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.871445 2567 log.go:25] "Validated CRI v1 image API" Apr 17 21:14:04.875845 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.875812 2567 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 21:14:04.878970 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.878949 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 21:14:04.880231 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.880210 2567 fs.go:135] Filesystem UUIDs: map[06a0d659-408e-4aad-8261-b50eb8b86def:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b2056753-e448-47f7-96b8-6096d7975cf4:/dev/nvme0n1p3] Apr 17 21:14:04.880272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.880232 2567 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs 
blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 21:14:04.886430 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.886321 2567 manager.go:217] Machine: {Timestamp:2026-04-17 21:14:04.884394609 +0000 UTC m=+0.453907372 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097252 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a834f061f52a9c3905efae4aab152 SystemUUID:ec2a834f-061f-52a9-c390-5efae4aab152 BootID:32a79b06-92d4-47d8-a5c8-9fcb30139752 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:97:2e:5a:fa:8d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:97:2e:5a:fa:8d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:26:fc:e1:9c:63:b1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 21:14:04.886430 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.886427 2567 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 21:14:04.886578 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.886565 2567 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 21:14:04.889300 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.889276 2567 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 21:14:04.889443 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.889302 2567 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-198.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 21:14:04.889491 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.889452 2567 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 21:14:04.889491 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.889460 2567 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 21:14:04.889491 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.889473 2567 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 21:14:04.890342 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.890331 2567 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 21:14:04.891942 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.891932 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 21:14:04.892062 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.892053 2567 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 21:14:04.894826 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.894817 2567 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 21:14:04.894865 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.894830 2567 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 21:14:04.894865 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.894842 2567 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 21:14:04.894865 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.894851 2567 kubelet.go:397] "Adding apiserver pod source"
Apr 17 21:14:04.894865 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.894860 2567 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 21:14:04.896045 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.896033 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 21:14:04.896095 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.896052 2567 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 21:14:04.899234 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.899221 2567 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 21:14:04.900709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.900696 2567 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 21:14:04.901536 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.901496 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wv78b"
Apr 17 21:14:04.903202 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903189 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903206 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903212 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903229 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903239 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903246 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903252 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903258 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903265 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 21:14:04.903268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903271 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 21:14:04.903498 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903284 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 21:14:04.903498 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.903293 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 21:14:04.904280 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.904268 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 21:14:04.904322 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.904280 2567 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 21:14:04.906855 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.906837 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wv78b"
Apr 17 21:14:04.907262 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.907246 2567 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-198.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 21:14:04.907404 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.907376 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 21:14:04.907462 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.907448 2567 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 21:14:04.908239 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.908227 2567 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 21:14:04.908274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.908264 2567 server.go:1295] "Started kubelet"
Apr 17 21:14:04.908435 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.908400 2567 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 21:14:04.908961 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.908892 2567 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 21:14:04.909088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.908991 2567 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 21:14:04.909065 ip-10-0-134-198 systemd[1]: Started Kubernetes Kubelet.
Apr 17 21:14:04.910274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.910253 2567 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 21:14:04.910738 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.910721 2567 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 21:14:04.917299 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.917278 2567 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 21:14:04.917567 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.917283 2567 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 21:14:04.918112 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.918091 2567 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 21:14:04.918112 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.918092 2567 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 21:14:04.918217 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.918127 2567 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 21:14:04.918217 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.918204 2567 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 21:14:04.918217 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.918215 2567 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 21:14:04.918301 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.918244 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:04.919491 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.919470 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:14:04.920448 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920432 2567 factory.go:153] Registering CRI-O factory
Apr 17 21:14:04.920549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920452 2567 factory.go:223] Registration of the crio container factory successfully
Apr 17 21:14:04.920549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920504 2567 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 21:14:04.920549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920514 2567 factory.go:55] Registering systemd factory
Apr 17 21:14:04.920549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920544 2567 factory.go:223] Registration of the systemd container factory successfully
Apr 17 21:14:04.920722 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920577 2567 factory.go:103] Registering Raw factory
Apr 17 21:14:04.920722 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.920592 2567 manager.go:1196] Started watching for new ooms in manager
Apr 17 21:14:04.921187 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.921150 2567 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 21:14:04.921636 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.921618 2567 manager.go:319] Starting recovery of all containers
Apr 17 21:14:04.922353 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.922333 2567 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-198.ec2.internal\" not found" node="ip-10-0-134-198.ec2.internal"
Apr 17 21:14:04.932442 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.932330 2567 manager.go:324] Recovery completed
Apr 17 21:14:04.936924 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.936912 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:14:04.939432 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.939417 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:14:04.939481 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.939443 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:14:04.939481 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.939454 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:14:04.939972 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.939952 2567 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 21:14:04.939972 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.939966 2567 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 21:14:04.940054 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.940000 2567 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 21:14:04.942586 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.942575 2567 policy_none.go:49] "None policy: Start"
Apr 17 21:14:04.942642 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.942592 2567 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 21:14:04.942642 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.942602 2567 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996217 2567 manager.go:341] "Starting Device Plugin manager"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.996277 2567 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996292 2567 server.go:85] "Starting device plugin registration server"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996682 2567 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996699 2567 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996807 2567 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996967 2567 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:04.996983 2567 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.997717 2567 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 21:14:04.999658 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:04.997760 2567 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.048721 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.048632 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 21:14:05.049975 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.049958 2567 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 21:14:05.050035 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.049988 2567 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 21:14:05.050035 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.050006 2567 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 21:14:05.050035 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.050013 2567 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 21:14:05.050158 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.050045 2567 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 21:14:05.053401 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.053375 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:14:05.097112 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.097080 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:14:05.098253 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.098237 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:14:05.098351 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.098268 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:14:05.098351 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.098280 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:14:05.098351 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.098306 2567 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.106415 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.106398 2567 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.106535 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.106424 2567 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-198.ec2.internal\": node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.131254 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.131228 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.150896 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.150873 2567 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"]
Apr 17 21:14:05.150966 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.150956 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:14:05.152446 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.152428 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:14:05.152556 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.152465 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:14:05.152556 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.152478 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:14:05.154109 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154094 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:14:05.154249 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154228 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.154300 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154262 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:14:05.154833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154807 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:14:05.154833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154817 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:14:05.154833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154826 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:14:05.154833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154838 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:14:05.155031 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154849 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:14:05.155031 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.154838 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:14:05.156278 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.156262 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.156369 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.156286 2567 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:14:05.156894 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.156877 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:14:05.156979 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.156904 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:14:05.156979 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.156915 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:14:05.184217 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.184195 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-198.ec2.internal\" not found" node="ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.188692 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.188676 2567 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-198.ec2.internal\" not found" node="ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.219380 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.219356 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15e91d92da8dd96c74a850fe41324fae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal\" (UID: \"15e91d92da8dd96c74a850fe41324fae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.219460 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.219384 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15e91d92da8dd96c74a850fe41324fae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal\" (UID: \"15e91d92da8dd96c74a850fe41324fae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.219460 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.219401 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fdcbae07efe3e8c40fcc3f75d7de6766-config\") pod \"kube-apiserver-proxy-ip-10-0-134-198.ec2.internal\" (UID: \"fdcbae07efe3e8c40fcc3f75d7de6766\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.232224 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.232203 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.319680 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.319612 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15e91d92da8dd96c74a850fe41324fae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal\" (UID: \"15e91d92da8dd96c74a850fe41324fae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.319680 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.319649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15e91d92da8dd96c74a850fe41324fae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal\" (UID: \"15e91d92da8dd96c74a850fe41324fae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.319780 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.319712 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/15e91d92da8dd96c74a850fe41324fae-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal\" (UID: \"15e91d92da8dd96c74a850fe41324fae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.319780 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.319766 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fdcbae07efe3e8c40fcc3f75d7de6766-config\") pod \"kube-apiserver-proxy-ip-10-0-134-198.ec2.internal\" (UID: \"fdcbae07efe3e8c40fcc3f75d7de6766\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.319838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.319786 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/15e91d92da8dd96c74a850fe41324fae-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal\" (UID: \"15e91d92da8dd96c74a850fe41324fae\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.319838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.319799 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fdcbae07efe3e8c40fcc3f75d7de6766-config\") pod \"kube-apiserver-proxy-ip-10-0-134-198.ec2.internal\" (UID: \"fdcbae07efe3e8c40fcc3f75d7de6766\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.332671 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.332650 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.433410 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.433376 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.486575 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.486553 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.491190 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.491162 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:05.534676 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.534633 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.635276 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.635208 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.735727 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.735701 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.817940 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.817914 2567 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 21:14:05.818486 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.818072 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:14:05.818486 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.818094 2567 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:14:05.836268 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.836248 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.908499 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.908386 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 21:09:04 +0000 UTC" deadline="2027-09-13 23:30:59.687182031 +0000 UTC"
Apr 17 21:14:05.908499 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.908420 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12338h16m53.778764418s"
Apr 17 21:14:05.917629 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.917614 2567 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 21:14:05.926465 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.926445 2567 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 21:14:05.937226 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:05.937203 2567 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-198.ec2.internal\" not found"
Apr 17 21:14:05.944493 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.944473 2567 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2wfh5"
Apr 17 21:14:05.952464 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.952449 2567 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2wfh5"
Apr 17 21:14:05.960025 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:05.960005 2567 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:14:06.018263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.018233 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:06.031475 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.031453 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:14:06.033336 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.033318 2567 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal"
Apr 17 21:14:06.044945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.044925 2567 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:14:06.086919 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:06.086870 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15e91d92da8dd96c74a850fe41324fae.slice/crio-5699cbb8ea7d274ef3b30bf4ae21b5443c720f3e761d8b5b3a9d39a430f3086f WatchSource:0}: Error finding container 5699cbb8ea7d274ef3b30bf4ae21b5443c720f3e761d8b5b3a9d39a430f3086f: Status 404 returned error can't find the container with id 5699cbb8ea7d274ef3b30bf4ae21b5443c720f3e761d8b5b3a9d39a430f3086f
Apr 17 21:14:06.087884 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:06.087844 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdcbae07efe3e8c40fcc3f75d7de6766.slice/crio-f971df1dbed1cb65140ff00ffdf83f54e5ffeec4c1d562dac4551b24e9f4046d WatchSource:0}: Error finding container f971df1dbed1cb65140ff00ffdf83f54e5ffeec4c1d562dac4551b24e9f4046d: Status 404 returned error can't find the container with id f971df1dbed1cb65140ff00ffdf83f54e5ffeec4c1d562dac4551b24e9f4046d
Apr 17 21:14:06.093365 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.093348 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 21:14:06.391186 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.391154 2567 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:14:06.857276 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.857228 2567 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:14:06.896503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.896477 2567 apiserver.go:52] "Watching apiserver"
Apr 17 21:14:06.904267 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.904238 2567 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 21:14:06.904725 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.904700 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-n4q7j","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm","openshift-cluster-node-tuning-operator/tuned-6gd22","openshift-image-registry/node-ca-kv4pm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal","openshift-multus/multus-additional-cni-plugins-xl898","openshift-network-operator/iptables-alerter-4bfg5","openshift-ovn-kubernetes/ovnkube-node-hn2f5","kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal","openshift-dns/node-resolver-khzwm","openshift-multus/multus-nqctl","openshift-multus/network-metrics-daemon-ndfzt","openshift-network-diagnostics/network-check-target-ph67v"]
Apr 17 21:14:06.907636 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.907611 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4bfg5"
Apr 17 21:14:06.907778 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.907752 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.910427 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.910352 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.911407 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.911245 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lk4tq\"" Apr 17 21:14:06.911407 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.911284 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f872d\"" Apr 17 21:14:06.911407 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.911281 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 21:14:06.911407 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.911303 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.911685 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.911573 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 21:14:06.912587 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.912566 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ntzjq\"" Apr 17 21:14:06.912690 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.912660 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.912792 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.912777 
2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.912875 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.912859 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.912931 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.912780 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.914420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.914404 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:06.915055 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.914923 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.915798 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.915769 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.915900 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.915852 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:06.916563 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.916546 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 21:14:06.916766 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.916747 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.917145 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.916932 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5qs5g\"" Apr 17 21:14:06.917145 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.916983 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.917291 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.917241 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.917879 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.917857 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.919208 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zkp65\"" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.919298 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.919435 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.920158 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.920274 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.920336 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hv2kl\"" Apr 17 21:14:06.920768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.919956 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.921098 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.920790 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.921098 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.921015 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 21:14:06.921196 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.921115 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 21:14:06.921196 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.921145 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.921863 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.921697 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 21:14:06.921863 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.921727 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.921863 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.921737 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 
21:14:06.922071 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922028 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 21:14:06.922332 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922180 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 21:14:06.922412 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922362 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-fdsrs\"" Apr 17 21:14:06.922773 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922757 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 21:14:06.923053 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922768 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 21:14:06.923143 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922851 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 21:14:06.923201 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.922892 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t8jv7\"" Apr 17 21:14:06.923481 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.923452 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8cjps\"" Apr 17 21:14:06.923747 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.923730 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:06.923827 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:06.923807 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:06.924052 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.924029 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:06.924124 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:06.924103 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:06.926506 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926484 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-ovnkube-config\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.926600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-os-release\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.926600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926566 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-k8s-cni-cncf-io\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.926600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926591 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-cni-bin\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.926715 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.926715 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926660 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-host\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.926805 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926710 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-tmp-dir\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:06.926805 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926741 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-device-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.926805 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926767 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn4nj\" (UniqueName: \"kubernetes.io/projected/bf0796a8-757c-48fe-ba20-dbb6e93de179-kube-api-access-gn4nj\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.926805 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:14:06.926794 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-node-log\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.926982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926820 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-iptables-alerter-script\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:06.926982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926843 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-cnibin\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.926982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b32da7b-3f9e-431b-b417-68d5b307bbd0-cni-binary-copy\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.926982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926914 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-host-slash\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " 
pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:06.926982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926939 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b17b894-59e3-497a-832e-05720d6d30d8-serviceca\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.926974 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-systemd\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-run\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927067 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cni-binary-copy\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927118 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-hosts-file\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927144 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-kubernetes\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927167 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gmz7\" (UniqueName: \"kubernetes.io/projected/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-kube-api-access-5gmz7\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:06.927195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927191 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d08d6c80-4c67-4cea-8f12-12b55c526b6d-konnectivity-ca\") pod \"konnectivity-agent-n4q7j\" (UID: \"d08d6c80-4c67-4cea-8f12-12b55c526b6d\") " pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:06.927503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927213 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-slash\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.927503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927237 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-run-netns\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.927503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927261 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-log-socket\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.927503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927283 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-cni-netd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.927503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-env-overrides\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.927503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927345 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-netns\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 
21:14:06.927873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927814 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-hostroot\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.927873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927858 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b17b894-59e3-497a-832e-05720d6d30d8-host\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:06.928022 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927882 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjznd\" (UniqueName: \"kubernetes.io/projected/2dfc2041-df04-460a-9385-f9b334671d62-kube-api-access-wjznd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.928022 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927905 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-cni-multus\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.928022 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927928 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-os-release\") pod \"multus-additional-cni-plugins-xl898\" (UID: 
\"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.928022 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.927954 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928019 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-socket-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-sys-fs\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928101 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysconfig\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928135 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-etc-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928162 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928186 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-run-ovn-kubernetes\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.928223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-kubelet\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928235 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-etc-kubernetes\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 
21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928257 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjv4r\" (UniqueName: \"kubernetes.io/projected/5b17b894-59e3-497a-832e-05720d6d30d8-kube-api-access-hjv4r\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928291 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928314 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d08d6c80-4c67-4cea-8f12-12b55c526b6d-agent-certs\") pod \"konnectivity-agent-n4q7j\" (UID: \"d08d6c80-4c67-4cea-8f12-12b55c526b6d\") " pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928340 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-system-cni-dir\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928363 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928394 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-modprobe-d\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-sys\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928428 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-lib-modules\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928449 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdrxc\" (UniqueName: \"kubernetes.io/projected/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-kube-api-access-cdrxc\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928473 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-var-lib-kubelet\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928497 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2dfc2041-df04-460a-9385-f9b334671d62-ovn-node-metrics-cert\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-system-cni-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928560 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-multus-certs\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.928592 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928584 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcksj\" (UniqueName: \"kubernetes.io/projected/49db8cb9-f293-4fa9-8608-10aa95283255-kube-api-access-lcksj\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928607 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysctl-conf\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928629 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-tuned\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928652 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf0796a8-757c-48fe-ba20-dbb6e93de179-tmp\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928673 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-ovn\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-cni-bin\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-conf-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928743 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-daemon-config\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928765 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysctl-d\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928787 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6vvt\" (UniqueName: \"kubernetes.io/projected/6d1fc6dd-6533-4846-82a8-55fc0feb006f-kube-api-access-d6vvt\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928849 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-systemd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928903 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.928963 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-ovnkube-script-lib\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929005 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-cni-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929050 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvd4\" (UniqueName: \"kubernetes.io/projected/9b32da7b-3f9e-431b-b417-68d5b307bbd0-kube-api-access-pgvd4\") pod \"multus-nqctl\" (UID: 
\"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-registration-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.929455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929117 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-etc-selinux\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:06.930002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929140 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-kubelet\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.930002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929172 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-systemd-units\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.930002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929205 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-socket-dir-parent\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:06.930002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929233 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cnibin\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:06.930002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.929259 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-var-lib-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:06.954581 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.954554 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:09:05 +0000 UTC" deadline="2027-09-16 01:22:19.190126215 +0000 UTC" Apr 17 21:14:06.954581 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:06.954579 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12388h8m12.235549447s" Apr 17 21:14:07.019410 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.019382 2567 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 21:14:07.029486 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029455 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-node-log\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.029486 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-iptables-alerter-script\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029540 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-cnibin\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029586 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-node-log\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029614 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-cnibin\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029627 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9b32da7b-3f9e-431b-b417-68d5b307bbd0-cni-binary-copy\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029661 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-host-slash\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029684 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b17b894-59e3-497a-832e-05720d6d30d8-serviceca\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029708 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-systemd\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-host-slash\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:07.029729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029732 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-run\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029761 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cni-binary-copy\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029784 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-hosts-file\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-kubernetes\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029834 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gmz7\" (UniqueName: \"kubernetes.io/projected/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-kube-api-access-5gmz7\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029857 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d08d6c80-4c67-4cea-8f12-12b55c526b6d-konnectivity-ca\") pod \"konnectivity-agent-n4q7j\" (UID: \"d08d6c80-4c67-4cea-8f12-12b55c526b6d\") " pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029881 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-slash\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029905 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-run-netns\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029926 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-log-socket\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029949 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-cni-netd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029974 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-env-overrides\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.029996 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-netns\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-hostroot\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030044 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b17b894-59e3-497a-832e-05720d6d30d8-host\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030068 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjznd\" (UniqueName: \"kubernetes.io/projected/2dfc2041-df04-460a-9385-f9b334671d62-kube-api-access-wjznd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030093 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-cni-multus\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030146 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-cni-multus\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.030165 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030153 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-slash\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030182 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5b17b894-59e3-497a-832e-05720d6d30d8-serviceca\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030181 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-iptables-alerter-script\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030200 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-log-socket\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030206 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-run-netns\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030221 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-netns\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030244 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5b17b894-59e3-497a-832e-05720d6d30d8-host\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030263 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-cni-netd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030283 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-systemd\") pod 
\"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-kubernetes\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030321 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-hostroot\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-run\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b32da7b-3f9e-431b-b417-68d5b307bbd0-cni-binary-copy\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030599 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-os-release\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " 
pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030638 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030666 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-socket-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030694 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-env-overrides\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030712 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-sys-fs\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.031047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-hosts-file\") pod 
\"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030735 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysconfig\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030769 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030781 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-os-release\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030795 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-etc-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030820 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030836 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-sys-fs\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030861 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cni-binary-copy\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-run-ovn-kubernetes\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030914 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-etc-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030924 2567 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysconfig\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-socket-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030917 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-kubelet\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030954 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030973 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-kubelet\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030970 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-etc-kubernetes\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031012 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-etc-kubernetes\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.031902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.030974 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-run-ovn-kubernetes\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjv4r\" (UniqueName: \"kubernetes.io/projected/5b17b894-59e3-497a-832e-05720d6d30d8-kube-api-access-hjv4r\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031055 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031079 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d08d6c80-4c67-4cea-8f12-12b55c526b6d-agent-certs\") pod \"konnectivity-agent-n4q7j\" (UID: \"d08d6c80-4c67-4cea-8f12-12b55c526b6d\") " pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031103 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-system-cni-dir\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-system-cni-dir\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031264 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.032672 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:14:07.031279 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031293 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-modprobe-d\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031301 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-sys\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031367 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-sys\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.032672 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031376 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-lib-modules\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdrxc\" (UniqueName: \"kubernetes.io/projected/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-kube-api-access-cdrxc\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031427 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-modprobe-d\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031441 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-var-lib-kubelet\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.032672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031475 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " 
pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-var-lib-kubelet\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031513 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2dfc2041-df04-460a-9385-f9b334671d62-ovn-node-metrics-cert\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031511 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-lib-modules\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-system-cni-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031578 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-multus-certs\") pod \"multus-nqctl\" (UID: 
\"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcksj\" (UniqueName: \"kubernetes.io/projected/49db8cb9-f293-4fa9-8608-10aa95283255-kube-api-access-lcksj\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031628 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysctl-conf\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031652 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-tuned\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031674 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-multus-certs\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf0796a8-757c-48fe-ba20-dbb6e93de179-tmp\") pod \"tuned-6gd22\" (UID: 
\"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d08d6c80-4c67-4cea-8f12-12b55c526b6d-konnectivity-ca\") pod \"konnectivity-agent-n4q7j\" (UID: \"d08d6c80-4c67-4cea-8f12-12b55c526b6d\") " pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031719 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-ovn\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031780 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-cni-bin\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031801 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-conf-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031818 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysctl-conf\") pod \"tuned-6gd22\" (UID: 
\"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031823 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-daemon-config\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031651 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-system-cni-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.033428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031865 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysctl-d\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031895 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6vvt\" (UniqueName: \"kubernetes.io/projected/6d1fc6dd-6533-4846-82a8-55fc0feb006f-kube-api-access-d6vvt\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031922 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-systemd\") pod 
\"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031924 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-cni-bin\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031967 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-systemd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.031963 2567 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032004 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-run-ovn\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032030 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032056 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-ovnkube-script-lib\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032077 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-cni-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032099 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvd4\" (UniqueName: 
\"kubernetes.io/projected/9b32da7b-3f9e-431b-b417-68d5b307bbd0-kube-api-access-pgvd4\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032120 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-registration-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032145 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-etc-selinux\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032165 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-sysctl-d\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032210 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-daemon-config\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032214 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032218 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-registration-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" Apr 17 21:14:07.034237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032231 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-cni-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl" Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032169 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-kubelet\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032272 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-systemd-units\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032323 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-etc-selinux\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032327 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-systemd-units\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032346 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-socket-dir-parent\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032372 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cnibin\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032374 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-host-kubelet\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032398 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-var-lib-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032427 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-ovnkube-config\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032431 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-socket-dir-parent\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032428 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-multus-conf-dir\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032455 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2dfc2041-df04-460a-9385-f9b334671d62-var-lib-openvswitch\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-os-release\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032466 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6d1fc6dd-6533-4846-82a8-55fc0feb006f-cnibin\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032491 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-k8s-cni-cncf-io\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032535 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-cni-bin\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-run-k8s-cni-cncf-io\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035047 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032552 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-os-release\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032578 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032605 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-host\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9b32da7b-3f9e-431b-b417-68d5b307bbd0-host-var-lib-cni-bin\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032630 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-tmp-dir\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032669 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-device-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032679 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf0796a8-757c-48fe-ba20-dbb6e93de179-host\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032694 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn4nj\" (UniqueName: \"kubernetes.io/projected/bf0796a8-757c-48fe-ba20-dbb6e93de179-kube-api-access-gn4nj\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032723 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztcch\" (UniqueName: \"kubernetes.io/projected/95290018-54bc-46b1-8b24-b0bae6086a51-kube-api-access-ztcch\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032797 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-ovnkube-script-lib\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49db8cb9-f293-4fa9-8608-10aa95283255-device-dir\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032912 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2dfc2041-df04-460a-9385-f9b334671d62-ovnkube-config\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.032961 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-tmp-dir\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm"
Apr 17 21:14:07.035692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.033634 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6d1fc6dd-6533-4846-82a8-55fc0feb006f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898"
Apr 17 21:14:07.036281 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.035935 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf0796a8-757c-48fe-ba20-dbb6e93de179-etc-tuned\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.036281 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.035976 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf0796a8-757c-48fe-ba20-dbb6e93de179-tmp\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.036281 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.036197 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2dfc2041-df04-460a-9385-f9b334671d62-ovn-node-metrics-cert\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.036885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.036782 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d08d6c80-4c67-4cea-8f12-12b55c526b6d-agent-certs\") pod \"konnectivity-agent-n4q7j\" (UID: \"d08d6c80-4c67-4cea-8f12-12b55c526b6d\") " pod="kube-system/konnectivity-agent-n4q7j"
Apr 17 21:14:07.038844 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.038754 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gmz7\" (UniqueName: \"kubernetes.io/projected/162cd45c-08b2-4eec-81a1-8cab6d1ebbeb-kube-api-access-5gmz7\") pod \"iptables-alerter-4bfg5\" (UID: \"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb\") " pod="openshift-network-operator/iptables-alerter-4bfg5"
Apr 17 21:14:07.038844 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.038805 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjv4r\" (UniqueName: \"kubernetes.io/projected/5b17b894-59e3-497a-832e-05720d6d30d8-kube-api-access-hjv4r\") pod \"node-ca-kv4pm\" (UID: \"5b17b894-59e3-497a-832e-05720d6d30d8\") " pod="openshift-image-registry/node-ca-kv4pm"
Apr 17 21:14:07.038998 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.038956 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjznd\" (UniqueName: \"kubernetes.io/projected/2dfc2041-df04-460a-9385-f9b334671d62-kube-api-access-wjznd\") pod \"ovnkube-node-hn2f5\" (UID: \"2dfc2041-df04-460a-9385-f9b334671d62\") " pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.039912 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.039872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdrxc\" (UniqueName: \"kubernetes.io/projected/e067c0b5-668b-46b2-855d-e7cc2d5b9db4-kube-api-access-cdrxc\") pod \"node-resolver-khzwm\" (UID: \"e067c0b5-668b-46b2-855d-e7cc2d5b9db4\") " pod="openshift-dns/node-resolver-khzwm"
Apr 17 21:14:07.040673 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.040649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcksj\" (UniqueName: \"kubernetes.io/projected/49db8cb9-f293-4fa9-8608-10aa95283255-kube-api-access-lcksj\") pod \"aws-ebs-csi-driver-node-qmjfm\" (UID: \"49db8cb9-f293-4fa9-8608-10aa95283255\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm"
Apr 17 21:14:07.041376 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.041332 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn4nj\" (UniqueName: \"kubernetes.io/projected/bf0796a8-757c-48fe-ba20-dbb6e93de179-kube-api-access-gn4nj\") pod \"tuned-6gd22\" (UID: \"bf0796a8-757c-48fe-ba20-dbb6e93de179\") " pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.041376 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.041360 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6vvt\" (UniqueName: \"kubernetes.io/projected/6d1fc6dd-6533-4846-82a8-55fc0feb006f-kube-api-access-d6vvt\") pod \"multus-additional-cni-plugins-xl898\" (UID: \"6d1fc6dd-6533-4846-82a8-55fc0feb006f\") " pod="openshift-multus/multus-additional-cni-plugins-xl898"
Apr 17 21:14:07.042953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.042934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvd4\" (UniqueName: \"kubernetes.io/projected/9b32da7b-3f9e-431b-b417-68d5b307bbd0-kube-api-access-pgvd4\") pod \"multus-nqctl\" (UID: \"9b32da7b-3f9e-431b-b417-68d5b307bbd0\") " pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.054101 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.054047 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal" event={"ID":"15e91d92da8dd96c74a850fe41324fae","Type":"ContainerStarted","Data":"5699cbb8ea7d274ef3b30bf4ae21b5443c720f3e761d8b5b3a9d39a430f3086f"}
Apr 17 21:14:07.055152 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.055124 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal" event={"ID":"fdcbae07efe3e8c40fcc3f75d7de6766","Type":"ContainerStarted","Data":"f971df1dbed1cb65140ff00ffdf83f54e5ffeec4c1d562dac4551b24e9f4046d"}
Apr 17 21:14:07.133915 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.133811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:14:07.133915 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.133882 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztcch\" (UniqueName: \"kubernetes.io/projected/95290018-54bc-46b1-8b24-b0bae6086a51-kube-api-access-ztcch\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:07.134142 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.133925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:07.134142 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.134052 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:14:07.134142 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.134141 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:07.634115638 +0000 UTC m=+3.203628380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:14:07.140285 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.140249 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 21:14:07.140285 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.140275 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 21:14:07.140285 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.140290 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:14:07.140555 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.140369 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed. No retries permitted until 2026-04-17 21:14:07.64034054 +0000 UTC m=+3.209853276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:14:07.142728 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.142695 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztcch\" (UniqueName: \"kubernetes.io/projected/95290018-54bc-46b1-8b24-b0bae6086a51-kube-api-access-ztcch\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:07.224623 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.224591 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4bfg5"
Apr 17 21:14:07.232404 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.232373 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm"
Apr 17 21:14:07.242213 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.242187 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6gd22"
Apr 17 21:14:07.249754 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.249731 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kv4pm"
Apr 17 21:14:07.256811 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.256791 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xl898"
Apr 17 21:14:07.264383 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.264364 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-n4q7j"
Apr 17 21:14:07.279059 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.279033 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:14:07.287750 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.287727 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-khzwm"
Apr 17 21:14:07.292377 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.292343 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nqctl"
Apr 17 21:14:07.309600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.309571 2567 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:14:07.637654 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.637623 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:07.637822 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.637769 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:14:07.637880 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.637850 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:08.637828524 +0000 UTC m=+4.207341263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:14:07.738910 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.738876 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:14:07.739101 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.739006 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 21:14:07.739101 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.739020 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 21:14:07.739101 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.739029 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:14:07.739101 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:07.739079 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed. No retries permitted until 2026-04-17 21:14:08.739065284 +0000 UTC m=+4.308578016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:14:07.761932 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.761898 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode067c0b5_668b_46b2_855d_e7cc2d5b9db4.slice/crio-fccf82c2938dddcd5b7e7888226a3c0237a26c87eb2a928865049438bf99a398 WatchSource:0}: Error finding container fccf82c2938dddcd5b7e7888226a3c0237a26c87eb2a928865049438bf99a398: Status 404 returned error can't find the container with id fccf82c2938dddcd5b7e7888226a3c0237a26c87eb2a928865049438bf99a398
Apr 17 21:14:07.763883 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.763854 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b17b894_59e3_497a_832e_05720d6d30d8.slice/crio-c516dea9c43f69cf4e7d1a6579d3fd4cc613ac81052a8e90d9088cc7361ca7ed WatchSource:0}: Error finding container c516dea9c43f69cf4e7d1a6579d3fd4cc613ac81052a8e90d9088cc7361ca7ed: Status 404 returned error can't find the container with id c516dea9c43f69cf4e7d1a6579d3fd4cc613ac81052a8e90d9088cc7361ca7ed
Apr 17 21:14:07.767634 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.767606 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b32da7b_3f9e_431b_b417_68d5b307bbd0.slice/crio-74b0b2789640cdafd297b4b9b02b646f8ddc3580e9eb9d08b1f642402fce7618 WatchSource:0}: Error finding container 74b0b2789640cdafd297b4b9b02b646f8ddc3580e9eb9d08b1f642402fce7618: Status 404 returned error can't find the container with id 74b0b2789640cdafd297b4b9b02b646f8ddc3580e9eb9d08b1f642402fce7618
Apr 17 21:14:07.768440 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.768419 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd08d6c80_4c67_4cea_8f12_12b55c526b6d.slice/crio-dc91beb40f2405450d5efac0de802f6aa0eba91f2146056d7bf859c27b387f87 WatchSource:0}: Error finding container dc91beb40f2405450d5efac0de802f6aa0eba91f2146056d7bf859c27b387f87: Status 404 returned error can't find the container with id dc91beb40f2405450d5efac0de802f6aa0eba91f2146056d7bf859c27b387f87
Apr 17 21:14:07.770043 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.770017 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0796a8_757c_48fe_ba20_dbb6e93de179.slice/crio-050b272ff41cb77ede4b8f8535d5904bee8dd42f5ac08d3623366d5e20888e4f WatchSource:0}: Error finding container 050b272ff41cb77ede4b8f8535d5904bee8dd42f5ac08d3623366d5e20888e4f: Status 404 returned error can't find the container with id 050b272ff41cb77ede4b8f8535d5904bee8dd42f5ac08d3623366d5e20888e4f
Apr 17 21:14:07.771962 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.771743 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1fc6dd_6533_4846_82a8_55fc0feb006f.slice/crio-0f4d3a7341806a76136ade7b0e2062cdbb2e78b3c2d2599679e6f2c218419736 WatchSource:0}: Error finding container 0f4d3a7341806a76136ade7b0e2062cdbb2e78b3c2d2599679e6f2c218419736: Status 404 returned error can't find the container with id 0f4d3a7341806a76136ade7b0e2062cdbb2e78b3c2d2599679e6f2c218419736
Apr 17 21:14:07.772734 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.772712 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dfc2041_df04_460a_9385_f9b334671d62.slice/crio-b5a6f8165fb83414cc1b7045007339b6fb61a93695e593876d470beef5da91fb WatchSource:0}: Error finding container b5a6f8165fb83414cc1b7045007339b6fb61a93695e593876d470beef5da91fb: Status 404 returned error can't find the container with id b5a6f8165fb83414cc1b7045007339b6fb61a93695e593876d470beef5da91fb
Apr 17 21:14:07.773689 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.773668 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod162cd45c_08b2_4eec_81a1_8cab6d1ebbeb.slice/crio-417e35d004243608f74ae21187da36b5b3c4c2e53280b08f8af0657dc1af35fa WatchSource:0}: Error finding container 417e35d004243608f74ae21187da36b5b3c4c2e53280b08f8af0657dc1af35fa: Status 404 returned error can't find the container with id 417e35d004243608f74ae21187da36b5b3c4c2e53280b08f8af0657dc1af35fa
Apr 17 21:14:07.775162 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:14:07.775126 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49db8cb9_f293_4fa9_8608_10aa95283255.slice/crio-0d282f4e97ea855d10a5569024d689730d25eeb0077cbbcfe03eb06233b74edf WatchSource:0}: Error finding container 0d282f4e97ea855d10a5569024d689730d25eeb0077cbbcfe03eb06233b74edf: Status 404 returned error can't find the container with id 0d282f4e97ea855d10a5569024d689730d25eeb0077cbbcfe03eb06233b74edf
Apr 17 21:14:07.955714 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.955606 2567 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:09:05 +0000 UTC" deadline="2028-01-27 17:22:01.75160083 +0000 UTC"
Apr 17 21:14:07.955714 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:07.955636 2567 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15596h7m53.795967491s"
Apr 17 21:14:08.057246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.057215 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6gd22" event={"ID":"bf0796a8-757c-48fe-ba20-dbb6e93de179","Type":"ContainerStarted","Data":"050b272ff41cb77ede4b8f8535d5904bee8dd42f5ac08d3623366d5e20888e4f"}
Apr 17 21:14:08.058172 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.058145 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n4q7j" event={"ID":"d08d6c80-4c67-4cea-8f12-12b55c526b6d","Type":"ContainerStarted","Data":"dc91beb40f2405450d5efac0de802f6aa0eba91f2146056d7bf859c27b387f87"}
Apr 17 21:14:08.059127 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.059100 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nqctl" event={"ID":"9b32da7b-3f9e-431b-b417-68d5b307bbd0","Type":"ContainerStarted","Data":"74b0b2789640cdafd297b4b9b02b646f8ddc3580e9eb9d08b1f642402fce7618"}
Apr 17 21:14:08.062923 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.062900 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal" event={"ID":"fdcbae07efe3e8c40fcc3f75d7de6766","Type":"ContainerStarted","Data":"e77f30a84cb1768e9ef6d43c4bb8ecd4272733969f9f7d53619e90d7db7efc6b"}
Apr 17 21:14:08.063925 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.063906 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" event={"ID":"49db8cb9-f293-4fa9-8608-10aa95283255","Type":"ContainerStarted","Data":"0d282f4e97ea855d10a5569024d689730d25eeb0077cbbcfe03eb06233b74edf"}
Apr 17 21:14:08.064812 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.064795 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"b5a6f8165fb83414cc1b7045007339b6fb61a93695e593876d470beef5da91fb"}
Apr 17 21:14:08.065771 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.065742 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kv4pm" event={"ID":"5b17b894-59e3-497a-832e-05720d6d30d8","Type":"ContainerStarted","Data":"c516dea9c43f69cf4e7d1a6579d3fd4cc613ac81052a8e90d9088cc7361ca7ed"}
Apr 17 21:14:08.066738 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.066706 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-khzwm" event={"ID":"e067c0b5-668b-46b2-855d-e7cc2d5b9db4","Type":"ContainerStarted","Data":"fccf82c2938dddcd5b7e7888226a3c0237a26c87eb2a928865049438bf99a398"}
Apr 17 21:14:08.067702 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.067681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4bfg5" event={"ID":"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb","Type":"ContainerStarted","Data":"417e35d004243608f74ae21187da36b5b3c4c2e53280b08f8af0657dc1af35fa"}
Apr 17 21:14:08.068555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.068535 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerStarted","Data":"0f4d3a7341806a76136ade7b0e2062cdbb2e78b3c2d2599679e6f2c218419736"}
Apr 17 21:14:08.076449 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.076411 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-198.ec2.internal" podStartSLOduration=2.076400966 podStartE2EDuration="2.076400966s" podCreationTimestamp="2026-04-17 21:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:14:08.075793095 +0000 UTC m=+3.645305847" watchObservedRunningTime="2026-04-17 21:14:08.076400966 +0000 UTC m=+3.645913718"
Apr 17 21:14:08.648848 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.648799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:08.649031 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:08.648976 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:14:08.649096 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:08.649046 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:10.649025935 +0000 UTC m=+6.218538666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:14:08.750277 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:08.749592 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:14:08.750277 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:08.749773 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 21:14:08.750277 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:08.749792 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 21:14:08.750277 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:08.749804 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:14:08.750277 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:08.749864 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed.
No retries permitted until 2026-04-17 21:14:10.74984362 +0000 UTC m=+6.319356352 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:09.052898 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:09.052789 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:09.053375 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:09.052940 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:09.053375 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:09.053296 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:09.053511 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:09.053411 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:09.083546 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:09.082820 2567 generic.go:358] "Generic (PLEG): container finished" podID="15e91d92da8dd96c74a850fe41324fae" containerID="3511dea7dcc8ad01aefeb75e030563ed72947635f12b66aaf8e658db194b5fbe" exitCode=0 Apr 17 21:14:09.083869 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:09.083793 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal" event={"ID":"15e91d92da8dd96c74a850fe41324fae","Type":"ContainerDied","Data":"3511dea7dcc8ad01aefeb75e030563ed72947635f12b66aaf8e658db194b5fbe"} Apr 17 21:14:10.104708 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:10.104667 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal" event={"ID":"15e91d92da8dd96c74a850fe41324fae","Type":"ContainerStarted","Data":"d825edd5a480a49b41b809f9bd7774b8d256e3bcc1dc67606282732fef829af3"} Apr 17 21:14:10.667775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:10.667734 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:10.667950 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:10.667881 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:10.667950 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:10.667948 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs 
podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:14.667929011 +0000 UTC m=+10.237441749 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:10.768122 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:10.768083 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:10.768304 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:10.768290 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:14:10.768364 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:10.768309 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:14:10.768364 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:10.768321 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:10.768468 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:10.768379 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed. No retries permitted until 2026-04-17 21:14:14.768360367 +0000 UTC m=+10.337873110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:11.051272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:11.051189 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:11.051443 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:11.051336 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:11.051782 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:11.051759 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:11.051918 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:11.051874 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:13.051134 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:13.051098 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:13.051576 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:13.051233 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:13.051576 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:13.051272 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:13.051576 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:13.051351 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:14.703084 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:14.703011 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:14.703597 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:14.703152 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:14.703597 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:14.703223 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:22.703206797 +0000 UTC m=+18.272719529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:14.803452 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:14.803412 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:14.803643 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:14.803618 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:14:14.803643 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:14.803636 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:14:14.803779 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:14.803650 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:14.803779 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:14.803707 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed. 
No retries permitted until 2026-04-17 21:14:22.80368912 +0000 UTC m=+18.373201857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:15.052025 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:15.051775 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:15.052025 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:15.051908 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:15.052289 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:15.052264 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:15.052414 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:15.052364 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:17.050918 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:17.050882 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:17.051368 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:17.050882 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:17.051368 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:17.051002 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:17.051368 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:17.051102 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:19.050865 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:19.050829 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:19.051331 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:19.050876 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:19.051331 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:19.050971 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:19.051331 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:19.051091 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:21.051249 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:21.051143 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:21.051249 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:21.051167 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:21.051832 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:21.051296 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:21.051832 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:21.051387 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:22.763914 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:22.763861 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:22.764378 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:22.764043 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:22.764378 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:22.764128 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:38.76410904 +0000 UTC m=+34.333621770 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:22.864642 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:22.864601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:22.864810 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:22.864750 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:14:22.864810 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:22.864765 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:14:22.864810 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:22.864774 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:22.864968 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:22.864837 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed. 
No retries permitted until 2026-04-17 21:14:38.864820425 +0000 UTC m=+34.434333157 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:23.050820 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:23.050738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:23.050962 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:23.050742 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:23.050962 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:23.050888 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:23.051109 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:23.050964 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:25.052000 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.051287 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:25.052000 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.051289 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:25.052000 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:25.051654 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:25.052000 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:25.051752 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:25.131115 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.131083 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6gd22" event={"ID":"bf0796a8-757c-48fe-ba20-dbb6e93de179","Type":"ContainerStarted","Data":"778caa931b4e917bb4b6d91a7d4166f35d3e1123c48c88983439363e8d92c2b4"} Apr 17 21:14:25.133870 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.133826 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nqctl" event={"ID":"9b32da7b-3f9e-431b-b417-68d5b307bbd0","Type":"ContainerStarted","Data":"3862f84677e76d8bdf9fe085d212e95f17c765adcba24d7e42509324d11eb126"} Apr 17 21:14:25.135345 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.135283 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" event={"ID":"49db8cb9-f293-4fa9-8608-10aa95283255","Type":"ContainerStarted","Data":"a6460f7536603194666eaf426cbd4e3f18354cb157bb27496223ccfe5ca0df9a"} Apr 17 21:14:25.136448 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.136379 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"bbe0213229860e61bb4d24b2ac86d65338979b971e2069ac850104a51bad0a3d"} Apr 17 21:14:25.139614 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.139590 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kv4pm" event={"ID":"5b17b894-59e3-497a-832e-05720d6d30d8","Type":"ContainerStarted","Data":"f2fc92cc5d7b9b0f633727f10a75465e8768b78950b2f803c0f7a8aede59789a"} Apr 17 21:14:25.148596 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.148551 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-6gd22" podStartSLOduration=3.113369434 podStartE2EDuration="20.148535733s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.772056749 +0000 UTC m=+3.341569479" lastFinishedPulling="2026-04-17 21:14:24.807223039 +0000 UTC m=+20.376735778" observedRunningTime="2026-04-17 21:14:25.148150259 +0000 UTC m=+20.717663011" watchObservedRunningTime="2026-04-17 21:14:25.148535733 +0000 UTC m=+20.718048488" Apr 17 21:14:25.148764 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.148733 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-198.ec2.internal" podStartSLOduration=19.148724642 podStartE2EDuration="19.148724642s" podCreationTimestamp="2026-04-17 21:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:14:10.119664129 +0000 UTC m=+5.689176906" watchObservedRunningTime="2026-04-17 21:14:25.148724642 +0000 UTC m=+20.718237396" Apr 17 21:14:25.163587 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.163508 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kv4pm" podStartSLOduration=11.443499458 podStartE2EDuration="20.163492749s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.766750401 +0000 UTC m=+3.336263136" lastFinishedPulling="2026-04-17 21:14:16.48674368 +0000 UTC m=+12.056256427" observedRunningTime="2026-04-17 21:14:25.163186555 +0000 UTC m=+20.732699308" watchObservedRunningTime="2026-04-17 21:14:25.163492749 +0000 UTC m=+20.733005503" Apr 17 21:14:25.178543 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:25.178428 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nqctl" podStartSLOduration=3.127481609 
podStartE2EDuration="20.17841433s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.769156829 +0000 UTC m=+3.338669575" lastFinishedPulling="2026-04-17 21:14:24.82008956 +0000 UTC m=+20.389602296" observedRunningTime="2026-04-17 21:14:25.177567639 +0000 UTC m=+20.747080393" watchObservedRunningTime="2026-04-17 21:14:25.17841433 +0000 UTC m=+20.747927083" Apr 17 21:14:26.142485 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.142402 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d1fc6dd-6533-4846-82a8-55fc0feb006f" containerID="e046c6083ebc0ae620d413bb7bf323e1c3f2e4ceb4cbabea8d14b8f7b950b054" exitCode=0 Apr 17 21:14:26.143194 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.142482 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerDied","Data":"e046c6083ebc0ae620d413bb7bf323e1c3f2e4ceb4cbabea8d14b8f7b950b054"} Apr 17 21:14:26.143753 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.143719 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n4q7j" event={"ID":"d08d6c80-4c67-4cea-8f12-12b55c526b6d","Type":"ContainerStarted","Data":"34d6ac8feb2d540b43245e4f90f362f0b768dc681f4fb4542aded459fb699f28"} Apr 17 21:14:26.147843 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.147821 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:14:26.148127 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.148110 2567 generic.go:358] "Generic (PLEG): container finished" podID="2dfc2041-df04-460a-9385-f9b334671d62" containerID="fad895a5c98e0ee4e841f0d7d0c96eef212d618aaa8ad1c31f94c060de332f0f" exitCode=1 Apr 17 21:14:26.148209 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.148168 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"80ea26e44d1bed6d9e6a6ec31bf17c0e23ed9bc5ad37c9c3f8153cfd66b7ba83"} Apr 17 21:14:26.148209 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.148187 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"651c2eced7bc24572862976be056c6dd80510f37da6c63d438675f1fb1430846"} Apr 17 21:14:26.148209 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.148201 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"f4a496441eee5004ab6e124a79c14555d9c11b5bb2281238bf7a609703475daa"} Apr 17 21:14:26.148321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.148210 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"76f2573fc441f64f9a2d881f9c708ecf138d985049d4a2be12fefe6f5255aa14"} Apr 17 21:14:26.148321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.148222 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerDied","Data":"fad895a5c98e0ee4e841f0d7d0c96eef212d618aaa8ad1c31f94c060de332f0f"} Apr 17 21:14:26.149195 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.149178 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-khzwm" event={"ID":"e067c0b5-668b-46b2-855d-e7cc2d5b9db4","Type":"ContainerStarted","Data":"7718ad7f62a0e586ff463935f331585cd94650732f943cc6ef0ff7d45974b1fe"} Apr 17 21:14:26.176290 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.176252 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-n4q7j" podStartSLOduration=4.139874149 podStartE2EDuration="21.176238163s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.769867285 +0000 UTC m=+3.339380017" lastFinishedPulling="2026-04-17 21:14:24.806231296 +0000 UTC m=+20.375744031" observedRunningTime="2026-04-17 21:14:26.176186325 +0000 UTC m=+21.745699077" watchObservedRunningTime="2026-04-17 21:14:26.176238163 +0000 UTC m=+21.745750915" Apr 17 21:14:26.190180 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.190144 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-khzwm" podStartSLOduration=4.191949161 podStartE2EDuration="21.190131843s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.764448259 +0000 UTC m=+3.333960998" lastFinishedPulling="2026-04-17 21:14:24.76263094 +0000 UTC m=+20.332143680" observedRunningTime="2026-04-17 21:14:26.189598279 +0000 UTC m=+21.759111031" watchObservedRunningTime="2026-04-17 21:14:26.190131843 +0000 UTC m=+21.759644596" Apr 17 21:14:26.511364 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.511332 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:26.512256 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.512232 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:26.589078 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:26.589027 2567 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 21:14:27.012183 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.012003 2567 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T21:14:26.589052337Z","UUID":"c2094d1c-5019-49e1-8d88-bf34e18ce569","Handler":null,"Name":"","Endpoint":""} Apr 17 21:14:27.014963 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.014936 2567 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 21:14:27.014963 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.014966 2567 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 21:14:27.050986 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.050958 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:27.051147 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.050958 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:27.051147 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:27.051082 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:27.051259 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:27.051146 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:27.152563 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.152502 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4bfg5" event={"ID":"162cd45c-08b2-4eec-81a1-8cab6d1ebbeb","Type":"ContainerStarted","Data":"e2e9414f44d78779513d95db4b6412b43a24e36f5c20699a5d6d76c6507b0b58"} Apr 17 21:14:27.154646 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.154619 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" event={"ID":"49db8cb9-f293-4fa9-8608-10aa95283255","Type":"ContainerStarted","Data":"061d777a09ab862e46808eaca34584ed15d13e726d9eb41cd3c14d31990ec7e4"} Apr 17 21:14:27.155097 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.155077 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:27.155677 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.155658 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-n4q7j" Apr 17 21:14:27.166782 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:27.166729 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4bfg5" podStartSLOduration=5.160758862 podStartE2EDuration="22.166713726s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.776899468 +0000 UTC m=+3.346412212" lastFinishedPulling="2026-04-17 21:14:24.78285433 +0000 UTC m=+20.352367076" observedRunningTime="2026-04-17 21:14:27.166258777 +0000 UTC m=+22.735771532" watchObservedRunningTime="2026-04-17 21:14:27.166713726 +0000 UTC m=+22.736226483" Apr 17 21:14:28.159484 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:28.159212 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" event={"ID":"49db8cb9-f293-4fa9-8608-10aa95283255","Type":"ContainerStarted","Data":"82a6236f14205d266afba69f2b460f0723d75d6808823dd77441cb824b1723da"} Apr 17 21:14:28.162497 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:28.162469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:14:28.162931 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:28.162895 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"6707416001e99f1712fc9a68bc3c9a6c64dd77668b9e666d82b229f2478c10cc"} Apr 17 21:14:28.175066 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:28.174974 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qmjfm" podStartSLOduration=3.040692859 podStartE2EDuration="23.174960116s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.776787436 +0000 UTC m=+3.346300170" lastFinishedPulling="2026-04-17 21:14:27.911054688 +0000 UTC m=+23.480567427" observedRunningTime="2026-04-17 21:14:28.173903321 +0000 UTC m=+23.743416101" watchObservedRunningTime="2026-04-17 21:14:28.174960116 +0000 UTC m=+23.744472869" Apr 17 21:14:29.050857 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:29.050825 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:29.051090 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:29.050952 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:29.051090 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:29.051014 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:29.051214 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:29.051143 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:31.051038 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.050850 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:31.051840 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.050850 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:31.051840 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:31.051128 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:31.051840 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:31.051174 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:31.171197 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.171166 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:14:31.171568 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.171537 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"20587292b5a73d51946fd07863eedc98be44d42e2d33f3ebe4f16c8c9c42bffc"} Apr 17 21:14:31.172006 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.171859 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:31.172006 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.171885 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:31.172006 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.171984 2567 scope.go:117] "RemoveContainer" containerID="fad895a5c98e0ee4e841f0d7d0c96eef212d618aaa8ad1c31f94c060de332f0f" Apr 17 21:14:31.173341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.173317 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d1fc6dd-6533-4846-82a8-55fc0feb006f" containerID="94deaed7c1802ac764c4f542fc801fd13933061779654e8489ca66d1c9c71b92" exitCode=0 Apr 17 21:14:31.173437 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.173370 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerDied","Data":"94deaed7c1802ac764c4f542fc801fd13933061779654e8489ca66d1c9c71b92"} Apr 17 21:14:31.186909 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.186889 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:31.187107 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:31.187070 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:32.114390 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.114358 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ndfzt"] Apr 17 21:14:32.114804 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.114483 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:32.114804 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:32.114592 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:32.117205 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.117179 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ph67v"] Apr 17 21:14:32.117337 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.117298 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:32.117415 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:32.117394 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:32.178612 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.178547 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:14:32.178852 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.178830 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" event={"ID":"2dfc2041-df04-460a-9385-f9b334671d62","Type":"ContainerStarted","Data":"d71b7b703187a3b5f7a2b804b94dd51f6d09938cfaca2fc670554a853eebc487"} Apr 17 21:14:32.179162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.179141 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" Apr 17 21:14:32.204153 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:32.204111 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5" podStartSLOduration=10.126898261000001 podStartE2EDuration="27.204096401s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.776307119 +0000 UTC m=+3.345819864" lastFinishedPulling="2026-04-17 21:14:24.853505268 +0000 UTC m=+20.423018004" observedRunningTime="2026-04-17 21:14:32.202867753 +0000 UTC m=+27.772380548" watchObservedRunningTime="2026-04-17 21:14:32.204096401 +0000 UTC m=+27.773609131" Apr 17 21:14:33.182663 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:33.182475 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d1fc6dd-6533-4846-82a8-55fc0feb006f" containerID="7b712387794a41aefbe0164f1ba4a9a54e08473c0d8c18bbd3f42d0662fe2e06" exitCode=0 Apr 17 21:14:33.182993 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:33.182534 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerDied","Data":"7b712387794a41aefbe0164f1ba4a9a54e08473c0d8c18bbd3f42d0662fe2e06"} Apr 17 21:14:34.050361 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:34.050328 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:34.050504 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:34.050328 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:34.050504 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:34.050429 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:34.050591 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:34.050500 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:35.187835 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:35.187796 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d1fc6dd-6533-4846-82a8-55fc0feb006f" containerID="ba5323b00ba08cfe35ac8ec5083fc952bfa062de40c806a0fe7c7a2c36c5c801" exitCode=0 Apr 17 21:14:35.188341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:35.187868 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerDied","Data":"ba5323b00ba08cfe35ac8ec5083fc952bfa062de40c806a0fe7c7a2c36c5c801"} Apr 17 21:14:36.050771 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:36.050736 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:36.050932 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:36.050736 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:36.050932 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:36.050888 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:36.050932 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:36.050922 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:38.050396 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:38.050357 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:38.050396 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:38.050378 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:38.050938 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.050479 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ndfzt" podUID="95290018-54bc-46b1-8b24-b0bae6086a51" Apr 17 21:14:38.050938 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.050666 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ph67v" podUID="efa382f4-9974-4993-ae95-7a0c981d06ab" Apr 17 21:14:38.780061 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:38.780029 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:14:38.780238 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.780177 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:38.780301 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.780251 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:10.780231036 +0000 UTC m=+66.349743772 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:14:38.881352 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:38.881315 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:14:38.881574 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.881443 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:14:38.881574 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.881462 2567 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:14:38.881574 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.881471 2567 projected.go:194] Error preparing data for projected volume kube-api-access-zh2mv for pod openshift-network-diagnostics/network-check-target-ph67v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:38.881574 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:38.881541 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv podName:efa382f4-9974-4993-ae95-7a0c981d06ab nodeName:}" failed. 
No retries permitted until 2026-04-17 21:15:10.881509387 +0000 UTC m=+66.451022123 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zh2mv" (UniqueName: "kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv") pod "network-check-target-ph67v" (UID: "efa382f4-9974-4993-ae95-7a0c981d06ab") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:14:39.280350 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.280320 2567 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-198.ec2.internal" event="NodeReady" Apr 17 21:14:39.280814 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.280443 2567 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 21:14:39.319271 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.319232 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4d6rl"] Apr 17 21:14:39.343977 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.343948 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-l25qj"] Apr 17 21:14:39.344140 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.344010 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4d6rl" Apr 17 21:14:39.346751 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.346728 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 21:14:39.346879 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.346729 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-k464c\"" Apr 17 21:14:39.347073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.347057 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 21:14:39.368371 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.368338 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4d6rl"] Apr 17 21:14:39.368371 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.368362 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l25qj" Apr 17 21:14:39.368371 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.368376 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l25qj"] Apr 17 21:14:39.370806 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.370781 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 21:14:39.370945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.370832 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 21:14:39.370945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.370853 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ggwsl\"" Apr 17 21:14:39.370945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.370783 2567 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 21:14:39.485409 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.485358 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49118387-7ece-4934-bcd2-c3a2447f3933-config-volume\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.485603 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.485421 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m64x2\" (UniqueName: \"kubernetes.io/projected/49118387-7ece-4934-bcd2-c3a2447f3933-kube-api-access-m64x2\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.485603 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.485589 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:39.485720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.485626 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brtj4\" (UniqueName: \"kubernetes.io/projected/46806929-68d7-4f2b-a1a4-39799c177ba4-kube-api-access-brtj4\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:39.485720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.485668 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49118387-7ece-4934-bcd2-c3a2447f3933-tmp-dir\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.485720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.485696 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.586801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.586704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:39.586801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.586754 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brtj4\" (UniqueName: \"kubernetes.io/projected/46806929-68d7-4f2b-a1a4-39799c177ba4-kube-api-access-brtj4\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:39.586801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.586785 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49118387-7ece-4934-bcd2-c3a2447f3933-tmp-dir\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.586810 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.586850 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49118387-7ece-4934-bcd2-c3a2447f3933-config-volume\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:39.586863 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.586879 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m64x2\" (UniqueName: \"kubernetes.io/projected/49118387-7ece-4934-bcd2-c3a2447f3933-kube-api-access-m64x2\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:39.586929 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:40.086909964 +0000 UTC m=+35.656422703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:39.586989 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:39.587073 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:39.587055 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:40.087038848 +0000 UTC m=+35.656551578 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:14:39.587395 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.587354 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49118387-7ece-4934-bcd2-c3a2447f3933-tmp-dir\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.587545 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.587501 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49118387-7ece-4934-bcd2-c3a2447f3933-config-volume\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.600295 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.600266 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m64x2\" (UniqueName: \"kubernetes.io/projected/49118387-7ece-4934-bcd2-c3a2447f3933-kube-api-access-m64x2\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:39.600427 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:39.600308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brtj4\" (UniqueName: \"kubernetes.io/projected/46806929-68d7-4f2b-a1a4-39799c177ba4-kube-api-access-brtj4\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:40.051041 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.051001 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:14:40.051232 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.051001 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:14:40.053823 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.053798 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 21:14:40.053947 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.053863 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 21:14:40.054920 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.054895 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zzp4n\""
Apr 17 21:14:40.054920 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.054921 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgbn6\""
Apr 17 21:14:40.055099 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.054896 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 21:14:40.090093 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.090066 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:40.090232 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:40.090116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:40.090232 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:40.090221 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:40.090340 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:40.090246 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:40.090340 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:40.090290 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:41.09027051 +0000 UTC m=+36.659783254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:14:40.090340 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:40.090312 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:41.090300113 +0000 UTC m=+36.659812849 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:14:41.096687 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:41.096648 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:41.097279 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:41.096741 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:41.097279 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:41.096837 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:41.097279 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:41.096894 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:41.097279 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:41.096921 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:43.096900635 +0000 UTC m=+38.666413379 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:14:41.097279 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:41.096952 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:43.096933467 +0000 UTC m=+38.666446203 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:14:42.203263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:42.203075 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d1fc6dd-6533-4846-82a8-55fc0feb006f" containerID="069213f730d89c8d4c0a31f7f0710dab652466c7c71c51cfd5ce3f1695aef168" exitCode=0
Apr 17 21:14:42.203720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:42.203161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerDied","Data":"069213f730d89c8d4c0a31f7f0710dab652466c7c71c51cfd5ce3f1695aef168"}
Apr 17 21:14:43.111415 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:43.111379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:43.111415 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:43.111417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:43.111648 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:43.111533 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:43.111648 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:43.111537 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:43.111648 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:43.111591 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:47.111575026 +0000 UTC m=+42.681087757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:14:43.111648 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:43.111609 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:47.111602433 +0000 UTC m=+42.681115164 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:14:43.208015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:43.207983 2567 generic.go:358] "Generic (PLEG): container finished" podID="6d1fc6dd-6533-4846-82a8-55fc0feb006f" containerID="76673ee23216c6726fe012cd52a6df7c033222fab836edda1b35c43ffee44ace" exitCode=0
Apr 17 21:14:43.208342 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:43.208030 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerDied","Data":"76673ee23216c6726fe012cd52a6df7c033222fab836edda1b35c43ffee44ace"}
Apr 17 21:14:44.212964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:44.212928 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xl898" event={"ID":"6d1fc6dd-6533-4846-82a8-55fc0feb006f","Type":"ContainerStarted","Data":"d4d2a81f80f404bfe98653a910eae35e00595a7b7a28c381eb2a194617bcc6bf"}
Apr 17 21:14:44.235893 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:44.235847 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xl898" podStartSLOduration=5.451798217 podStartE2EDuration="39.235830926s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:14:07.775339847 +0000 UTC m=+3.344852591" lastFinishedPulling="2026-04-17 21:14:41.559372563 +0000 UTC m=+37.128885300" observedRunningTime="2026-04-17 21:14:44.235300763 +0000 UTC m=+39.804813543" watchObservedRunningTime="2026-04-17 21:14:44.235830926 +0000 UTC m=+39.805343679"
Apr 17 21:14:47.141535 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:47.141485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:47.141910 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:47.141545 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:47.141910 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:47.141636 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:47.141910 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:47.141702 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:55.141687198 +0000 UTC m=+50.711199934 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:14:47.141910 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:47.141642 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:47.141910 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:47.141786 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:14:55.141774308 +0000 UTC m=+50.711287038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:14:55.200873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:55.200831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:14:55.201372 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:14:55.200925 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:14:55.201372 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:55.200989 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:14:55.201372 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:55.201009 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:14:55.201372 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:55.201060 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:11.201044608 +0000 UTC m=+66.770557340 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:14:55.201372 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:14:55.201078 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:11.201067924 +0000 UTC m=+66.770580659 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:15:03.199175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:03.199149 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hn2f5"
Apr 17 21:15:10.800264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.800212 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:15:10.802996 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.802975 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 21:15:10.811023 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:10.810997 2567 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 21:15:10.811107 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:10.811077 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs podName:95290018-54bc-46b1-8b24-b0bae6086a51 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:14.811056996 +0000 UTC m=+130.380569730 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs") pod "network-metrics-daemon-ndfzt" (UID: "95290018-54bc-46b1-8b24-b0bae6086a51") : secret "metrics-daemon-secret" not found
Apr 17 21:15:10.901105 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.901071 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:15:10.903931 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.903913 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 21:15:10.914068 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.914046 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 21:15:10.925094 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.925066 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2mv\" (UniqueName: \"kubernetes.io/projected/efa382f4-9974-4993-ae95-7a0c981d06ab-kube-api-access-zh2mv\") pod \"network-check-target-ph67v\" (UID: \"efa382f4-9974-4993-ae95-7a0c981d06ab\") " pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:15:10.972258 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.972223 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgbn6\""
Apr 17 21:15:10.980212 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:10.980190 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:15:11.103342 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:11.103310 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ph67v"]
Apr 17 21:15:11.107215 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:15:11.107187 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefa382f4_9974_4993_ae95_7a0c981d06ab.slice/crio-f7db996f28e14f63f153c4874fedf8d1b63c5cc5b63d8fd2fb9663df476f1788 WatchSource:0}: Error finding container f7db996f28e14f63f153c4874fedf8d1b63c5cc5b63d8fd2fb9663df476f1788: Status 404 returned error can't find the container with id f7db996f28e14f63f153c4874fedf8d1b63c5cc5b63d8fd2fb9663df476f1788
Apr 17 21:15:11.203464 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:11.203429 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj"
Apr 17 21:15:11.203464 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:11.203474 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl"
Apr 17 21:15:11.203741 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:11.203611 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:15:11.203741 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:11.203692 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:43.203670012 +0000 UTC m=+98.773182759 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found
Apr 17 21:15:11.203741 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:11.203612 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:15:11.203741 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:11.203744 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:43.203732275 +0000 UTC m=+98.773245005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found
Apr 17 21:15:11.262642 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:11.262612 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ph67v" event={"ID":"efa382f4-9974-4993-ae95-7a0c981d06ab","Type":"ContainerStarted","Data":"f7db996f28e14f63f153c4874fedf8d1b63c5cc5b63d8fd2fb9663df476f1788"}
Apr 17 21:15:14.270396 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:14.270361 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ph67v" event={"ID":"efa382f4-9974-4993-ae95-7a0c981d06ab","Type":"ContainerStarted","Data":"5a6aa20c3f5aef09b74852d039b7fffb30899a855af708f3f3fb9ed50f1b3e88"}
Apr 17 21:15:14.270852 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:14.270488 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ph67v"
Apr 17 21:15:14.285468 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:14.285422 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ph67v" podStartSLOduration=66.620570384 podStartE2EDuration="1m9.285408012s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:15:11.108865555 +0000 UTC m=+66.678378300" lastFinishedPulling="2026-04-17 21:15:13.773703196 +0000 UTC m=+69.343215928" observedRunningTime="2026-04-17 21:15:14.28444001 +0000 UTC m=+69.853952777" watchObservedRunningTime="2026-04-17 21:15:14.285408012 +0000 UTC m=+69.854920764"
Apr 17 21:15:30.765389 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.765265 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"]
Apr 17 21:15:30.769391 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.769370 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.773021 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.772996 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 21:15:30.773021 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.773012 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 21:15:30.774316 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.774293 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 21:15:30.774455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.774400 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cjnxb\""
Apr 17 21:15:30.774538 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.774488 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 21:15:30.778679 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.778659 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"]
Apr 17 21:15:30.838428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.838387 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz5tr\" (UniqueName: \"kubernetes.io/projected/699d9724-833b-4266-b2d6-0ae0369b1d91-kube-api-access-qz5tr\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.838428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.838431 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/699d9724-833b-4266-b2d6-0ae0369b1d91-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.838668 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.838455 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.938975 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.938944 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qz5tr\" (UniqueName: \"kubernetes.io/projected/699d9724-833b-4266-b2d6-0ae0369b1d91-kube-api-access-qz5tr\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.939066 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.938982 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/699d9724-833b-4266-b2d6-0ae0369b1d91-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.939066 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.939007 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.939139 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:30.939119 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 21:15:30.939185 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:30.939175 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls podName:699d9724-833b-4266-b2d6-0ae0369b1d91 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:31.439161479 +0000 UTC m=+87.008674214 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7wmg7" (UID: "699d9724-833b-4266-b2d6-0ae0369b1d91") : secret "cluster-monitoring-operator-tls" not found
Apr 17 21:15:30.939675 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.939656 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/699d9724-833b-4266-b2d6-0ae0369b1d91-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:30.947092 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:30.947075 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz5tr\" (UniqueName: \"kubernetes.io/projected/699d9724-833b-4266-b2d6-0ae0369b1d91-kube-api-access-qz5tr\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:31.443387 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:31.443352 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"
Apr 17 21:15:31.443569 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:31.443512 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 21:15:31.443615 ip-10-0-134-198
kubenswrapper[2567]: E0417 21:15:31.443611 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls podName:699d9724-833b-4266-b2d6-0ae0369b1d91 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:32.443590437 +0000 UTC m=+88.013103172 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7wmg7" (UID: "699d9724-833b-4266-b2d6-0ae0369b1d91") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:32.452320 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:32.452255 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:15:32.452845 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:32.452582 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:32.452845 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:32.452675 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls podName:699d9724-833b-4266-b2d6-0ae0369b1d91 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:34.452653136 +0000 UTC m=+90.022165889 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7wmg7" (UID: "699d9724-833b-4266-b2d6-0ae0369b1d91") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:33.633884 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.633847 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4"] Apr 17 21:15:33.636853 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.636837 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" Apr 17 21:15:33.639173 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.639145 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9jl5l\"" Apr 17 21:15:33.639291 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.639151 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 21:15:33.640048 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.640034 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:15:33.645577 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.645553 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4"] Apr 17 21:15:33.734483 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.734440 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx"] Apr 17 21:15:33.737267 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:15:33.737248 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:33.739650 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.739625 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 21:15:33.739758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.739667 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-96xvk\"" Apr 17 21:15:33.739815 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.739768 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:15:33.739997 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.739983 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 21:15:33.744743 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.744719 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx"] Apr 17 21:15:33.763532 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.763482 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svxrk\" (UniqueName: \"kubernetes.io/projected/c2b61a7e-5701-47f5-9c33-420684fa3f8d-kube-api-access-svxrk\") pod \"volume-data-source-validator-7c6cbb6c87-8f5t4\" (UID: \"c2b61a7e-5701-47f5-9c33-420684fa3f8d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" Apr 17 21:15:33.864912 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.864876 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-svxrk\" (UniqueName: \"kubernetes.io/projected/c2b61a7e-5701-47f5-9c33-420684fa3f8d-kube-api-access-svxrk\") pod \"volume-data-source-validator-7c6cbb6c87-8f5t4\" (UID: \"c2b61a7e-5701-47f5-9c33-420684fa3f8d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" Apr 17 21:15:33.865091 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.864928 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:33.865091 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.864964 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t7bx\" (UniqueName: \"kubernetes.io/projected/1499bff6-6da6-4cac-bef0-0e6307a5810f-kube-api-access-7t7bx\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:33.873011 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.872985 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svxrk\" (UniqueName: \"kubernetes.io/projected/c2b61a7e-5701-47f5-9c33-420684fa3f8d-kube-api-access-svxrk\") pod \"volume-data-source-validator-7c6cbb6c87-8f5t4\" (UID: \"c2b61a7e-5701-47f5-9c33-420684fa3f8d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" Apr 17 21:15:33.946179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.946073 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" Apr 17 21:15:33.966230 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.966197 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:33.966382 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.966245 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t7bx\" (UniqueName: \"kubernetes.io/projected/1499bff6-6da6-4cac-bef0-0e6307a5810f-kube-api-access-7t7bx\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:33.966451 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:33.966373 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:15:33.966504 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:33.966468 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls podName:1499bff6-6da6-4cac-bef0-0e6307a5810f nodeName:}" failed. No retries permitted until 2026-04-17 21:15:34.466444909 +0000 UTC m=+90.035957657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8lmdx" (UID: "1499bff6-6da6-4cac-bef0-0e6307a5810f") : secret "samples-operator-tls" not found Apr 17 21:15:33.976634 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:33.976607 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t7bx\" (UniqueName: \"kubernetes.io/projected/1499bff6-6da6-4cac-bef0-0e6307a5810f-kube-api-access-7t7bx\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:34.064616 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.064583 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4"] Apr 17 21:15:34.067557 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:15:34.067528 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b61a7e_5701_47f5_9c33_420684fa3f8d.slice/crio-bfadde72c79387ea090e4dad31ec6db257feef0c34b25d89fdfc5ab36d5091e1 WatchSource:0}: Error finding container bfadde72c79387ea090e4dad31ec6db257feef0c34b25d89fdfc5ab36d5091e1: Status 404 returned error can't find the container with id bfadde72c79387ea090e4dad31ec6db257feef0c34b25d89fdfc5ab36d5091e1 Apr 17 21:15:34.305984 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.305908 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" event={"ID":"c2b61a7e-5701-47f5-9c33-420684fa3f8d","Type":"ContainerStarted","Data":"bfadde72c79387ea090e4dad31ec6db257feef0c34b25d89fdfc5ab36d5091e1"} Apr 17 21:15:34.469384 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:15:34.469349 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:15:34.469547 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.469438 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:34.469547 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:34.469514 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:34.469636 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:34.469594 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls podName:699d9724-833b-4266-b2d6-0ae0369b1d91 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:38.46957768 +0000 UTC m=+94.039090416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7wmg7" (UID: "699d9724-833b-4266-b2d6-0ae0369b1d91") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:34.469636 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:34.469541 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:15:34.469707 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:34.469666 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls podName:1499bff6-6da6-4cac-bef0-0e6307a5810f nodeName:}" failed. No retries permitted until 2026-04-17 21:15:35.469653807 +0000 UTC m=+91.039166542 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8lmdx" (UID: "1499bff6-6da6-4cac-bef0-0e6307a5810f") : secret "samples-operator-tls" not found Apr 17 21:15:34.633102 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.633068 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2d498"] Apr 17 21:15:34.637153 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.637135 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.639809 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.639782 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:15:34.639923 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.639816 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-876hd\"" Apr 17 21:15:34.639923 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.639847 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 21:15:34.639923 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.639845 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 21:15:34.640188 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.640167 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 21:15:34.648198 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.648172 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 21:15:34.650563 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.650537 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2d498"] Apr 17 21:15:34.671383 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.671348 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d6b966-299f-473e-b704-4ce1b867b0b5-config\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.671383 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.671385 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d6b966-299f-473e-b704-4ce1b867b0b5-trusted-ca\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.671590 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.671411 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d6b966-299f-473e-b704-4ce1b867b0b5-serving-cert\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.671590 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.671533 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7mc\" (UniqueName: \"kubernetes.io/projected/c7d6b966-299f-473e-b704-4ce1b867b0b5-kube-api-access-hm7mc\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.772867 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.772816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d6b966-299f-473e-b704-4ce1b867b0b5-config\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.772867 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.772863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d6b966-299f-473e-b704-4ce1b867b0b5-trusted-ca\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.773100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.772885 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d6b966-299f-473e-b704-4ce1b867b0b5-serving-cert\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.773100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.773073 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7mc\" (UniqueName: \"kubernetes.io/projected/c7d6b966-299f-473e-b704-4ce1b867b0b5-kube-api-access-hm7mc\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.773650 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.773615 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d6b966-299f-473e-b704-4ce1b867b0b5-config\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.773788 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.773724 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7d6b966-299f-473e-b704-4ce1b867b0b5-trusted-ca\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.775697 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.775676 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7d6b966-299f-473e-b704-4ce1b867b0b5-serving-cert\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.780854 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.780834 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7mc\" (UniqueName: \"kubernetes.io/projected/c7d6b966-299f-473e-b704-4ce1b867b0b5-kube-api-access-hm7mc\") pod \"console-operator-9d4b6777b-2d498\" (UID: \"c7d6b966-299f-473e-b704-4ce1b867b0b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:34.948143 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:34.948066 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:35.076433 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:35.076405 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2d498"] Apr 17 21:15:35.079479 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:15:35.079448 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d6b966_299f_473e_b704_4ce1b867b0b5.slice/crio-d4a141674293aeff7687cf7c2f091c3e0c549803fa83f9d17184557a8ded2256 WatchSource:0}: Error finding container d4a141674293aeff7687cf7c2f091c3e0c549803fa83f9d17184557a8ded2256: Status 404 returned error can't find the container with id d4a141674293aeff7687cf7c2f091c3e0c549803fa83f9d17184557a8ded2256 Apr 17 21:15:35.309105 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:35.309021 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-2d498" event={"ID":"c7d6b966-299f-473e-b704-4ce1b867b0b5","Type":"ContainerStarted","Data":"d4a141674293aeff7687cf7c2f091c3e0c549803fa83f9d17184557a8ded2256"} Apr 17 21:15:35.479452 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:35.479414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:35.479647 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:35.479609 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:15:35.479705 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:35.479692 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls podName:1499bff6-6da6-4cac-bef0-0e6307a5810f nodeName:}" failed. No retries permitted until 2026-04-17 21:15:37.47967056 +0000 UTC m=+93.049183295 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8lmdx" (UID: "1499bff6-6da6-4cac-bef0-0e6307a5810f") : secret "samples-operator-tls" not found Apr 17 21:15:36.006035 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.005956 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-khzwm_e067c0b5-668b-46b2-855d-e7cc2d5b9db4/dns-node-resolver/0.log" Apr 17 21:15:36.011930 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.011894 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56876bf7b7-b22j9"] Apr 17 21:15:36.014885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.014863 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.017557 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.017418 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 21:15:36.017557 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.017470 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxtwb\"" Apr 17 21:15:36.017557 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.017539 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 21:15:36.017786 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.017609 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 21:15:36.023842 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.023816 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 21:15:36.025718 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.025696 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56876bf7b7-b22j9"] Apr 17 21:15:36.083535 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083483 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcnj6\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-kube-api-access-qcnj6\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083535 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083538 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-bound-sa-token\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61101b23-f011-4fe9-ab1b-ae44f83ebb50-ca-trust-extracted\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083694 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-image-registry-private-configuration\") pod 
\"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083738 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083768 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-installation-pull-secrets\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083901 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083815 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-certificates\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.083901 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.083832 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-trusted-ca\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184327 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184282 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-installation-pull-secrets\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184497 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184450 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-certificates\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184497 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184486 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-trusted-ca\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184649 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184532 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcnj6\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-kube-api-access-qcnj6\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184649 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184561 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-bound-sa-token\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184649 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61101b23-f011-4fe9-ab1b-ae44f83ebb50-ca-trust-extracted\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.184998 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.184966 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-image-registry-private-configuration\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.185128 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.185038 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.185179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.185137 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-certificates\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " 
pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.185179 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:36.185165 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:15:36.185241 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:36.185179 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56876bf7b7-b22j9: secret "image-registry-tls" not found Apr 17 21:15:36.185241 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:36.185229 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls podName:61101b23-f011-4fe9-ab1b-ae44f83ebb50 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:36.685213152 +0000 UTC m=+92.254725894 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls") pod "image-registry-56876bf7b7-b22j9" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50") : secret "image-registry-tls" not found Apr 17 21:15:36.185618 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.185575 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61101b23-f011-4fe9-ab1b-ae44f83ebb50-ca-trust-extracted\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.185742 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.185721 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-trusted-ca\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " 
pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.187161 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.187138 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-installation-pull-secrets\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.189689 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.189669 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-image-registry-private-configuration\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.195809 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.195790 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-bound-sa-token\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.196073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.196053 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcnj6\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-kube-api-access-qcnj6\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.313237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.313149 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" event={"ID":"c2b61a7e-5701-47f5-9c33-420684fa3f8d","Type":"ContainerStarted","Data":"da5f9579cd717b00d6ce6baffdb8e158441511ab806e265a99db4cb4c208acbd"} Apr 17 21:15:36.327189 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.327144 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-8f5t4" podStartSLOduration=1.722307159 podStartE2EDuration="3.327126574s" podCreationTimestamp="2026-04-17 21:15:33 +0000 UTC" firstStartedPulling="2026-04-17 21:15:34.069387114 +0000 UTC m=+89.638899860" lastFinishedPulling="2026-04-17 21:15:35.674206541 +0000 UTC m=+91.243719275" observedRunningTime="2026-04-17 21:15:36.326451926 +0000 UTC m=+91.895964693" watchObservedRunningTime="2026-04-17 21:15:36.327126574 +0000 UTC m=+91.896639328" Apr 17 21:15:36.690627 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.690581 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:36.690800 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:36.690723 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:15:36.690800 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:36.690740 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56876bf7b7-b22j9: secret "image-registry-tls" not found Apr 17 21:15:36.690921 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:36.690805 2567 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls podName:61101b23-f011-4fe9-ab1b-ae44f83ebb50 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:37.690785179 +0000 UTC m=+93.260297923 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls") pod "image-registry-56876bf7b7-b22j9" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50") : secret "image-registry-tls" not found Apr 17 21:15:36.804207 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:36.804174 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kv4pm_5b17b894-59e3-497a-832e-05720d6d30d8/node-ca/0.log" Apr 17 21:15:37.316344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:37.316264 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/0.log" Apr 17 21:15:37.316344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:37.316307 2567 generic.go:358] "Generic (PLEG): container finished" podID="c7d6b966-299f-473e-b704-4ce1b867b0b5" containerID="504c7a4038a05729732ff78990eb944ec0450158745c52982715e989600e1aa1" exitCode=255 Apr 17 21:15:37.316760 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:37.316396 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" event={"ID":"c7d6b966-299f-473e-b704-4ce1b867b0b5","Type":"ContainerDied","Data":"504c7a4038a05729732ff78990eb944ec0450158745c52982715e989600e1aa1"} Apr 17 21:15:37.316760 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:37.316604 2567 scope.go:117] "RemoveContainer" containerID="504c7a4038a05729732ff78990eb944ec0450158745c52982715e989600e1aa1" Apr 17 21:15:37.497970 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:37.497929 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:37.498128 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:37.498075 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:15:37.498167 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:37.498141 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls podName:1499bff6-6da6-4cac-bef0-0e6307a5810f nodeName:}" failed. No retries permitted until 2026-04-17 21:15:41.498126979 +0000 UTC m=+97.067639709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8lmdx" (UID: "1499bff6-6da6-4cac-bef0-0e6307a5810f") : secret "samples-operator-tls" not found Apr 17 21:15:37.699154 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:37.699116 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:37.699300 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:37.699259 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:15:37.699300 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:37.699277 2567 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56876bf7b7-b22j9: secret "image-registry-tls" not found Apr 17 21:15:37.699390 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:37.699328 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls podName:61101b23-f011-4fe9-ab1b-ae44f83ebb50 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:39.6993134 +0000 UTC m=+95.268826131 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls") pod "image-registry-56876bf7b7-b22j9" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50") : secret "image-registry-tls" not found Apr 17 21:15:38.319399 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.319373 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/1.log" Apr 17 21:15:38.319779 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.319768 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/0.log" Apr 17 21:15:38.319819 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.319800 2567 generic.go:358] "Generic (PLEG): container finished" podID="c7d6b966-299f-473e-b704-4ce1b867b0b5" containerID="a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df" exitCode=255 Apr 17 21:15:38.319852 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.319827 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" 
event={"ID":"c7d6b966-299f-473e-b704-4ce1b867b0b5","Type":"ContainerDied","Data":"a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df"} Apr 17 21:15:38.319885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.319855 2567 scope.go:117] "RemoveContainer" containerID="504c7a4038a05729732ff78990eb944ec0450158745c52982715e989600e1aa1" Apr 17 21:15:38.320105 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.320084 2567 scope.go:117] "RemoveContainer" containerID="a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df" Apr 17 21:15:38.320297 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:38.320274 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2d498_openshift-console-operator(c7d6b966-299f-473e-b704-4ce1b867b0b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podUID="c7d6b966-299f-473e-b704-4ce1b867b0b5" Apr 17 21:15:38.507172 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:38.507120 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:15:38.507353 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:38.507274 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:38.507353 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:38.507338 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls 
podName:699d9724-833b-4266-b2d6-0ae0369b1d91 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:46.507323068 +0000 UTC m=+102.076835800 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7wmg7" (UID: "699d9724-833b-4266-b2d6-0ae0369b1d91") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:39.323008 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:39.322976 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/1.log" Apr 17 21:15:39.323351 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:39.323286 2567 scope.go:117] "RemoveContainer" containerID="a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df" Apr 17 21:15:39.323454 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:39.323436 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2d498_openshift-console-operator(c7d6b966-299f-473e-b704-4ce1b867b0b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podUID="c7d6b966-299f-473e-b704-4ce1b867b0b5" Apr 17 21:15:39.716689 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:39.716643 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:39.716847 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:39.716763 2567 projected.go:264] 
Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:15:39.716847 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:39.716782 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56876bf7b7-b22j9: secret "image-registry-tls" not found Apr 17 21:15:39.716847 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:39.716838 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls podName:61101b23-f011-4fe9-ab1b-ae44f83ebb50 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:43.716822916 +0000 UTC m=+99.286335652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls") pod "image-registry-56876bf7b7-b22j9" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50") : secret "image-registry-tls" not found Apr 17 21:15:41.530407 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:41.530353 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:41.530815 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:41.530553 2567 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:15:41.530815 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:41.530629 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls podName:1499bff6-6da6-4cac-bef0-0e6307a5810f 
nodeName:}" failed. No retries permitted until 2026-04-17 21:15:49.530611382 +0000 UTC m=+105.100124113 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8lmdx" (UID: "1499bff6-6da6-4cac-bef0-0e6307a5810f") : secret "samples-operator-tls" not found Apr 17 21:15:42.125883 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.125847 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw"] Apr 17 21:15:42.129807 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.129783 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" Apr 17 21:15:42.132445 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.132425 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 21:15:42.132624 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.132425 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 21:15:42.133580 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.133561 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-d769t\"" Apr 17 21:15:42.135114 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.135094 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw"] Apr 17 21:15:42.237350 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.237305 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dsb57\" (UniqueName: \"kubernetes.io/projected/e2a830a3-2453-4387-872f-788fbca4b588-kube-api-access-dsb57\") pod \"migrator-74bb7799d9-w92gw\" (UID: \"e2a830a3-2453-4387-872f-788fbca4b588\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" Apr 17 21:15:42.337993 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.337957 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsb57\" (UniqueName: \"kubernetes.io/projected/e2a830a3-2453-4387-872f-788fbca4b588-kube-api-access-dsb57\") pod \"migrator-74bb7799d9-w92gw\" (UID: \"e2a830a3-2453-4387-872f-788fbca4b588\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" Apr 17 21:15:42.346008 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.345977 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsb57\" (UniqueName: \"kubernetes.io/projected/e2a830a3-2453-4387-872f-788fbca4b588-kube-api-access-dsb57\") pod \"migrator-74bb7799d9-w92gw\" (UID: \"e2a830a3-2453-4387-872f-788fbca4b588\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" Apr 17 21:15:42.438408 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.438321 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" Apr 17 21:15:42.552397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:42.552365 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw"] Apr 17 21:15:42.555664 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:15:42.555638 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2a830a3_2453_4387_872f_788fbca4b588.slice/crio-9502b7dfc9d7afba780b19cb574769a204ec77b8099e2f0ff724fc52fccf5cb7 WatchSource:0}: Error finding container 9502b7dfc9d7afba780b19cb574769a204ec77b8099e2f0ff724fc52fccf5cb7: Status 404 returned error can't find the container with id 9502b7dfc9d7afba780b19cb574769a204ec77b8099e2f0ff724fc52fccf5cb7 Apr 17 21:15:43.243914 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:43.243880 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj" Apr 17 21:15:43.243914 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:43.243924 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl" Apr 17 21:15:43.244152 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.244056 2567 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:15:43.244152 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.244073 2567 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 17 21:15:43.244152 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.244135 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert podName:46806929-68d7-4f2b-a1a4-39799c177ba4 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:47.244113868 +0000 UTC m=+162.813626608 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert") pod "ingress-canary-l25qj" (UID: "46806929-68d7-4f2b-a1a4-39799c177ba4") : secret "canary-serving-cert" not found Apr 17 21:15:43.244293 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.244155 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls podName:49118387-7ece-4934-bcd2-c3a2447f3933 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:47.244145859 +0000 UTC m=+162.813658594 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls") pod "dns-default-4d6rl" (UID: "49118387-7ece-4934-bcd2-c3a2447f3933") : secret "dns-default-metrics-tls" not found Apr 17 21:15:43.332585 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:43.332552 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" event={"ID":"e2a830a3-2453-4387-872f-788fbca4b588","Type":"ContainerStarted","Data":"9502b7dfc9d7afba780b19cb574769a204ec77b8099e2f0ff724fc52fccf5cb7"} Apr 17 21:15:43.748473 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:43.748430 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:43.748875 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.748602 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:15:43.748875 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.748621 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56876bf7b7-b22j9: secret "image-registry-tls" not found Apr 17 21:15:43.748875 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:43.748677 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls podName:61101b23-f011-4fe9-ab1b-ae44f83ebb50 nodeName:}" failed. No retries permitted until 2026-04-17 21:15:51.748661379 +0000 UTC m=+107.318174109 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls") pod "image-registry-56876bf7b7-b22j9" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50") : secret "image-registry-tls" not found Apr 17 21:15:44.337473 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:44.337439 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" event={"ID":"e2a830a3-2453-4387-872f-788fbca4b588","Type":"ContainerStarted","Data":"077f8d7af88167a140831be3fedd8db074bde50941f3fa4994dc270a9d99a350"} Apr 17 21:15:44.337473 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:44.337479 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" event={"ID":"e2a830a3-2453-4387-872f-788fbca4b588","Type":"ContainerStarted","Data":"9e4179505062565d9e212bb07848e0c93c7fa939096ac7647f9d06656664fc60"} Apr 17 21:15:44.353975 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:44.353925 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-w92gw" podStartSLOduration=1.436643065 podStartE2EDuration="2.353909004s" podCreationTimestamp="2026-04-17 21:15:42 +0000 UTC" firstStartedPulling="2026-04-17 21:15:42.557820927 +0000 UTC m=+98.127333659" lastFinishedPulling="2026-04-17 21:15:43.475086865 +0000 UTC m=+99.044599598" observedRunningTime="2026-04-17 21:15:44.353129109 +0000 UTC m=+99.922641863" watchObservedRunningTime="2026-04-17 21:15:44.353909004 +0000 UTC m=+99.923421811" Apr 17 21:15:44.949027 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:44.948972 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:44.949027 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:44.949028 2567 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:15:44.949458 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:44.949381 2567 scope.go:117] "RemoveContainer" containerID="a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df" Apr 17 21:15:44.949570 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:44.949553 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2d498_openshift-console-operator(c7d6b966-299f-473e-b704-4ce1b867b0b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podUID="c7d6b966-299f-473e-b704-4ce1b867b0b5" Apr 17 21:15:45.275161 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:45.275078 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ph67v" Apr 17 21:15:46.573460 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:46.573422 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:15:46.589574 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:46.573543 2567 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:46.589574 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:46.573596 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls 
podName:699d9724-833b-4266-b2d6-0ae0369b1d91 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:02.573582205 +0000 UTC m=+118.143094936 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7wmg7" (UID: "699d9724-833b-4266-b2d6-0ae0369b1d91") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:15:49.597558 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:49.597483 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:49.599875 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:49.599848 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1499bff6-6da6-4cac-bef0-0e6307a5810f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8lmdx\" (UID: \"1499bff6-6da6-4cac-bef0-0e6307a5810f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:49.646906 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:49.646865 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" Apr 17 21:15:49.761420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:49.761233 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx"] Apr 17 21:15:50.354161 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:50.354116 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" event={"ID":"1499bff6-6da6-4cac-bef0-0e6307a5810f","Type":"ContainerStarted","Data":"260a6ac6f874212d4a8c8380c932027f0734aac1d45b2d6b51998d9cc4e6559c"} Apr 17 21:15:51.816765 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:51.816726 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") pod \"image-registry-56876bf7b7-b22j9\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:15:51.817099 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:51.816866 2567 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:15:51.817099 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:51.816883 2567 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56876bf7b7-b22j9: secret "image-registry-tls" not found Apr 17 21:15:51.817099 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:51.816934 2567 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls podName:61101b23-f011-4fe9-ab1b-ae44f83ebb50 nodeName:}" failed. No retries permitted until 2026-04-17 21:16:07.816920268 +0000 UTC m=+123.386432999 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls") pod "image-registry-56876bf7b7-b22j9" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50") : secret "image-registry-tls" not found Apr 17 21:15:52.360718 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:52.360681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" event={"ID":"1499bff6-6da6-4cac-bef0-0e6307a5810f","Type":"ContainerStarted","Data":"91827808befe14076322660916856ca52296338d985ffe5835d16861e7dca056"} Apr 17 21:15:52.360718 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:52.360717 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" event={"ID":"1499bff6-6da6-4cac-bef0-0e6307a5810f","Type":"ContainerStarted","Data":"b8e8cbbac3273f3937fcf6f6aa474563b44d903d332ae082f54fe0aab729c38b"} Apr 17 21:15:52.377340 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:52.377288 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8lmdx" podStartSLOduration=17.455894423 podStartE2EDuration="19.377273257s" podCreationTimestamp="2026-04-17 21:15:33 +0000 UTC" firstStartedPulling="2026-04-17 21:15:49.800382197 +0000 UTC m=+105.369894932" lastFinishedPulling="2026-04-17 21:15:51.721761027 +0000 UTC m=+107.291273766" observedRunningTime="2026-04-17 21:15:52.376900652 +0000 UTC m=+107.946413405" watchObservedRunningTime="2026-04-17 21:15:52.377273257 +0000 UTC m=+107.946786009" Apr 17 21:15:57.050990 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.050947 2567 scope.go:117] "RemoveContainer" containerID="a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df" Apr 17 21:15:57.373883 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.373856 2567 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:15:57.374242 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.374224 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/1.log" Apr 17 21:15:57.374299 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.374260 2567 generic.go:358] "Generic (PLEG): container finished" podID="c7d6b966-299f-473e-b704-4ce1b867b0b5" containerID="372ecb8ebe909abaf94b7e33931aa4967a37309d76d249596b69308d36a63701" exitCode=255 Apr 17 21:15:57.374345 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.374327 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" event={"ID":"c7d6b966-299f-473e-b704-4ce1b867b0b5","Type":"ContainerDied","Data":"372ecb8ebe909abaf94b7e33931aa4967a37309d76d249596b69308d36a63701"} Apr 17 21:15:57.374395 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.374361 2567 scope.go:117] "RemoveContainer" containerID="a0d640ec347da730095f1b75f0b1f68d26ff647db2c092b9ae76c0d19656e4df" Apr 17 21:15:57.374724 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:15:57.374703 2567 scope.go:117] "RemoveContainer" containerID="372ecb8ebe909abaf94b7e33931aa4967a37309d76d249596b69308d36a63701" Apr 17 21:15:57.374934 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:15:57.374913 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2d498_openshift-console-operator(c7d6b966-299f-473e-b704-4ce1b867b0b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podUID="c7d6b966-299f-473e-b704-4ce1b867b0b5" Apr 17 21:15:58.378053 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:15:58.378027 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:16:02.602220 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:02.602169 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:16:02.604533 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:02.604488 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/699d9724-833b-4266-b2d6-0ae0369b1d91-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7wmg7\" (UID: \"699d9724-833b-4266-b2d6-0ae0369b1d91\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:16:02.877794 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:02.877700 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" Apr 17 21:16:02.997020 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:02.996984 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7"] Apr 17 21:16:03.000091 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:03.000064 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699d9724_833b_4266_b2d6_0ae0369b1d91.slice/crio-d1c4af7db7dba4e6f22139ea9de24133159e6348dd1d5d31fe08f446f5c1bc41 WatchSource:0}: Error finding container d1c4af7db7dba4e6f22139ea9de24133159e6348dd1d5d31fe08f446f5c1bc41: Status 404 returned error can't find the container with id d1c4af7db7dba4e6f22139ea9de24133159e6348dd1d5d31fe08f446f5c1bc41 Apr 17 21:16:03.396777 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:03.396734 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" event={"ID":"699d9724-833b-4266-b2d6-0ae0369b1d91","Type":"ContainerStarted","Data":"d1c4af7db7dba4e6f22139ea9de24133159e6348dd1d5d31fe08f446f5c1bc41"} Apr 17 21:16:04.948953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:04.948924 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:16:04.948953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:04.948958 2567 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" Apr 17 21:16:04.949359 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:04.949285 2567 scope.go:117] "RemoveContainer" containerID="372ecb8ebe909abaf94b7e33931aa4967a37309d76d249596b69308d36a63701" Apr 17 21:16:04.949532 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:16:04.949497 2567 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2d498_openshift-console-operator(c7d6b966-299f-473e-b704-4ce1b867b0b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podUID="c7d6b966-299f-473e-b704-4ce1b867b0b5" Apr 17 21:16:05.402365 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.402300 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" event={"ID":"699d9724-833b-4266-b2d6-0ae0369b1d91","Type":"ContainerStarted","Data":"5b01f2714c5b44d31fb9a0fc2bc74a0e3e852b51c41914e83a3559cd8a43028b"} Apr 17 21:16:05.417214 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.417153 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7wmg7" podStartSLOduration=33.419193779 podStartE2EDuration="35.417136369s" podCreationTimestamp="2026-04-17 21:15:30 +0000 UTC" firstStartedPulling="2026-04-17 21:16:03.002274372 +0000 UTC m=+118.571787103" lastFinishedPulling="2026-04-17 21:16:05.000216946 +0000 UTC m=+120.569729693" observedRunningTime="2026-04-17 21:16:05.416474449 +0000 UTC m=+120.985987225" watchObservedRunningTime="2026-04-17 21:16:05.417136369 +0000 UTC m=+120.986649123" Apr 17 21:16:05.733948 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.733865 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56876bf7b7-b22j9"] Apr 17 21:16:05.734096 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:16:05.734036 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" podUID="61101b23-f011-4fe9-ab1b-ae44f83ebb50" Apr 17 21:16:05.768705 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.768673 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-76c8559dfb-m94ms"] Apr 17 21:16:05.771978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.771956 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.786796 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.786767 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76c8559dfb-m94ms"] Apr 17 21:16:05.841449 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.841412 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9j7wq"] Apr 17 21:16:05.844941 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.844919 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:05.848069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.848044 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 21:16:05.848182 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.848100 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 21:16:05.848318 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.848304 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5zvnz\"" Apr 17 21:16:05.848408 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.848390 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 21:16:05.848584 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.848566 2567 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 21:16:05.853485 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.853463 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9j7wq"] Apr 17 21:16:05.930338 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930304 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97a746ea-9b60-4490-927e-dcdccb81be88-trusted-ca\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930505 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930409 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-bound-sa-token\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930505 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930441 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm5n6\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-kube-api-access-xm5n6\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930505 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930494 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-registry-tls\") pod \"image-registry-76c8559dfb-m94ms\" (UID: 
\"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930647 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930544 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97a746ea-9b60-4490-927e-dcdccb81be88-image-registry-private-configuration\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930647 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930568 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97a746ea-9b60-4490-927e-dcdccb81be88-registry-certificates\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930647 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930588 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97a746ea-9b60-4490-927e-dcdccb81be88-installation-pull-secrets\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:05.930647 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:05.930621 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97a746ea-9b60-4490-927e-dcdccb81be88-ca-trust-extracted\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 
21:16:06.031461 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031364 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2d200915-24d1-46aa-9b70-20c8ff4392cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.031461 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031424 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-registry-tls\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2d200915-24d1-46aa-9b70-20c8ff4392cb-crio-socket\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2d200915-24d1-46aa-9b70-20c8ff4392cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031606 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/97a746ea-9b60-4490-927e-dcdccb81be88-image-registry-private-configuration\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031648 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2596\" (UniqueName: \"kubernetes.io/projected/2d200915-24d1-46aa-9b70-20c8ff4392cb-kube-api-access-j2596\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031678 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97a746ea-9b60-4490-927e-dcdccb81be88-registry-certificates\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031713 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97a746ea-9b60-4490-927e-dcdccb81be88-installation-pull-secrets\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031744 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97a746ea-9b60-4490-927e-dcdccb81be88-ca-trust-extracted\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " 
pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031781 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d200915-24d1-46aa-9b70-20c8ff4392cb-data-volume\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.032010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.031815 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97a746ea-9b60-4490-927e-dcdccb81be88-trusted-ca\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032465 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.032159 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97a746ea-9b60-4490-927e-dcdccb81be88-ca-trust-extracted\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032465 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.032257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-bound-sa-token\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032465 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.032295 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm5n6\" (UniqueName: 
\"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-kube-api-access-xm5n6\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032642 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.032594 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97a746ea-9b60-4490-927e-dcdccb81be88-registry-certificates\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.032726 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.032705 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97a746ea-9b60-4490-927e-dcdccb81be88-trusted-ca\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.034068 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.034047 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-registry-tls\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.034309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.034293 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/97a746ea-9b60-4490-927e-dcdccb81be88-image-registry-private-configuration\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.034549 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.034513 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97a746ea-9b60-4490-927e-dcdccb81be88-installation-pull-secrets\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.039890 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.039871 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-bound-sa-token\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.040065 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.040045 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm5n6\" (UniqueName: \"kubernetes.io/projected/97a746ea-9b60-4490-927e-dcdccb81be88-kube-api-access-xm5n6\") pod \"image-registry-76c8559dfb-m94ms\" (UID: \"97a746ea-9b60-4490-927e-dcdccb81be88\") " pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.083272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.083244 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxtwb\"" Apr 17 21:16:06.091259 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.091234 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.133273 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133228 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d200915-24d1-46aa-9b70-20c8ff4392cb-data-volume\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.133454 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2d200915-24d1-46aa-9b70-20c8ff4392cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.133454 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133396 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2d200915-24d1-46aa-9b70-20c8ff4392cb-crio-socket\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.133454 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133425 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2d200915-24d1-46aa-9b70-20c8ff4392cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.133656 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133464 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j2596\" (UniqueName: \"kubernetes.io/projected/2d200915-24d1-46aa-9b70-20c8ff4392cb-kube-api-access-j2596\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.133656 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133566 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2d200915-24d1-46aa-9b70-20c8ff4392cb-crio-socket\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.133763 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.133672 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2d200915-24d1-46aa-9b70-20c8ff4392cb-data-volume\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.134086 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.134059 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2d200915-24d1-46aa-9b70-20c8ff4392cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.137689 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.137641 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2d200915-24d1-46aa-9b70-20c8ff4392cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.144496 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.144447 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2596\" (UniqueName: \"kubernetes.io/projected/2d200915-24d1-46aa-9b70-20c8ff4392cb-kube-api-access-j2596\") pod \"insights-runtime-extractor-9j7wq\" (UID: \"2d200915-24d1-46aa-9b70-20c8ff4392cb\") " pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.154427 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.154395 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9j7wq" Apr 17 21:16:06.216812 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.216631 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-76c8559dfb-m94ms"] Apr 17 21:16:06.218884 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:06.218853 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a746ea_9b60_4490_927e_dcdccb81be88.slice/crio-d2799c1f646fe0d34bd7d7888d51ea887307e74dfcc2e8801ab0a086e8e44c8a WatchSource:0}: Error finding container d2799c1f646fe0d34bd7d7888d51ea887307e74dfcc2e8801ab0a086e8e44c8a: Status 404 returned error can't find the container with id d2799c1f646fe0d34bd7d7888d51ea887307e74dfcc2e8801ab0a086e8e44c8a Apr 17 21:16:06.278975 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.278947 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9j7wq"] Apr 17 21:16:06.281828 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:06.281764 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d200915_24d1_46aa_9b70_20c8ff4392cb.slice/crio-c7ea2a39939980580508299ecfba90a90d80746b31c42aca17bd7dbf94c2023c WatchSource:0}: Error finding container 
c7ea2a39939980580508299ecfba90a90d80746b31c42aca17bd7dbf94c2023c: Status 404 returned error can't find the container with id c7ea2a39939980580508299ecfba90a90d80746b31c42aca17bd7dbf94c2023c Apr 17 21:16:06.406268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.406230 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9j7wq" event={"ID":"2d200915-24d1-46aa-9b70-20c8ff4392cb","Type":"ContainerStarted","Data":"4091331f28ac67fcd12d0267a0f6a0744d6d24a49c17baa6d63407b089d3f15e"} Apr 17 21:16:06.406447 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.406275 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9j7wq" event={"ID":"2d200915-24d1-46aa-9b70-20c8ff4392cb","Type":"ContainerStarted","Data":"c7ea2a39939980580508299ecfba90a90d80746b31c42aca17bd7dbf94c2023c"} Apr 17 21:16:06.407588 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.407556 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" event={"ID":"97a746ea-9b60-4490-927e-dcdccb81be88","Type":"ContainerStarted","Data":"f7d09bf884c8bf03bbfb0d64d457c2c2b8ff9822161a0cae750e93b10be56a9e"} Apr 17 21:16:06.407720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.407595 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" event={"ID":"97a746ea-9b60-4490-927e-dcdccb81be88","Type":"ContainerStarted","Data":"d2799c1f646fe0d34bd7d7888d51ea887307e74dfcc2e8801ab0a086e8e44c8a"} Apr 17 21:16:06.407720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.407608 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:16:06.407884 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.407859 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" Apr 17 21:16:06.411959 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.411940 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:16:06.425215 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.425174 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms" podStartSLOduration=1.425162019 podStartE2EDuration="1.425162019s" podCreationTimestamp="2026-04-17 21:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:16:06.424680412 +0000 UTC m=+121.994193206" watchObservedRunningTime="2026-04-17 21:16:06.425162019 +0000 UTC m=+121.994674771" Apr 17 21:16:06.537868 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.537769 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61101b23-f011-4fe9-ab1b-ae44f83ebb50-ca-trust-extracted\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.537868 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.537848 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-bound-sa-token\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.538088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.537898 2567 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-image-registry-private-configuration\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.538088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.537934 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-installation-pull-secrets\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.538088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.537959 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcnj6\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-kube-api-access-qcnj6\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.538088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.537982 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-trusted-ca\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.538088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.538017 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-certificates\") pod \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\" (UID: \"61101b23-f011-4fe9-ab1b-ae44f83ebb50\") " Apr 17 21:16:06.538088 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.538025 2567 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/61101b23-f011-4fe9-ab1b-ae44f83ebb50-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:16:06.538456 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.538391 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:16:06.538456 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.538409 2567 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/61101b23-f011-4fe9-ab1b-ae44f83ebb50-ca-trust-extracted\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:06.539428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.539395 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:16:06.540360 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.540332 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:16:06.540555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.540509 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:16:06.540665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.540556 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:16:06.540819 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.540802 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-kube-api-access-qcnj6" (OuterVolumeSpecName: "kube-api-access-qcnj6") pod "61101b23-f011-4fe9-ab1b-ae44f83ebb50" (UID: "61101b23-f011-4fe9-ab1b-ae44f83ebb50"). InnerVolumeSpecName "kube-api-access-qcnj6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:16:06.638813 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.638773 2567 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-bound-sa-token\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:06.638813 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.638804 2567 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-image-registry-private-configuration\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:06.638813 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.638814 2567 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/61101b23-f011-4fe9-ab1b-ae44f83ebb50-installation-pull-secrets\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:06.638813 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.638823 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qcnj6\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-kube-api-access-qcnj6\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:06.639061 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.638834 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-trusted-ca\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:06.639061 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:06.638843 2567 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-certificates\") on node \"ip-10-0-134-198.ec2.internal\" 
DevicePath \"\"" Apr 17 21:16:07.411607 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:07.411569 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9j7wq" event={"ID":"2d200915-24d1-46aa-9b70-20c8ff4392cb","Type":"ContainerStarted","Data":"b30429509c0d3f0e7c6cdeb8d26ee19f6321527421bc0359e79d5533490a8865"} Apr 17 21:16:07.411607 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:07.411614 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56876bf7b7-b22j9" Apr 17 21:16:07.439838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:07.439808 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56876bf7b7-b22j9"] Apr 17 21:16:07.443356 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:07.443335 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56876bf7b7-b22j9"] Apr 17 21:16:07.545974 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:07.545934 2567 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/61101b23-f011-4fe9-ab1b-ae44f83ebb50-registry-tls\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:16:09.053969 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:09.053923 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61101b23-f011-4fe9-ab1b-ae44f83ebb50" path="/var/lib/kubelet/pods/61101b23-f011-4fe9-ab1b-ae44f83ebb50/volumes" Apr 17 21:16:09.419468 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:09.419432 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9j7wq" event={"ID":"2d200915-24d1-46aa-9b70-20c8ff4392cb","Type":"ContainerStarted","Data":"f7f6044f870a3108992e245e8540ab42895478365c677ce3ed69168d659aba07"} Apr 17 21:16:09.436529 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:09.436465 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9j7wq" podStartSLOduration=1.8282062030000001 podStartE2EDuration="4.436449306s" podCreationTimestamp="2026-04-17 21:16:05 +0000 UTC" firstStartedPulling="2026-04-17 21:16:06.340232308 +0000 UTC m=+121.909745053" lastFinishedPulling="2026-04-17 21:16:08.94847522 +0000 UTC m=+124.517988156" observedRunningTime="2026-04-17 21:16:09.434774355 +0000 UTC m=+125.004287120" watchObservedRunningTime="2026-04-17 21:16:09.436449306 +0000 UTC m=+125.005962060" Apr 17 21:16:12.894314 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.894278 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-27ts9"] Apr 17 21:16:12.897575 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.897553 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.900571 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.900545 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 21:16:12.900712 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.900688 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 21:16:12.900712 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.900700 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 21:16:12.900874 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.900713 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 21:16:12.901653 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.901639 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m8ph4\"" Apr 17 21:16:12.914081 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.914060 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-lvpjg"] Apr 17 21:16:12.916283 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.916265 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.919231 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.919204 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-9zsx8\"" Apr 17 21:16:12.919535 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.919508 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 21:16:12.919860 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.919843 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 21:16:12.920031 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.920010 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 21:16:12.925455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.925429 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-lvpjg"] Apr 17 21:16:12.988624 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988577 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwcb7\" (UniqueName: \"kubernetes.io/projected/7bc9046e-be5c-4615-af31-2fa594c57289-kube-api-access-zwcb7\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " 
pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.988624 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988644 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc95v\" (UniqueName: \"kubernetes.io/projected/a4522390-be08-4472-b1b5-4d1db64090fb-kube-api-access-vc95v\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988700 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-root\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988740 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-tls\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988775 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc9046e-be5c-4615-af31-2fa594c57289-metrics-client-ca\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988792 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988864 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4522390-be08-4472-b1b5-4d1db64090fb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988907 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a4522390-be08-4472-b1b5-4d1db64090fb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988937 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.988964 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.988960 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-sys\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.989321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.989037 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-wtmp\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.989321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.989063 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-accelerators-collector-config\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:12.989321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.989092 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: 
\"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:12.989321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:12.989123 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-textfile\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089537 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-wtmp\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089537 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089542 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-accelerators-collector-config\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089781 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089664 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.089781 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089714 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-textfile\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089781 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-wtmp\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089790 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwcb7\" (UniqueName: \"kubernetes.io/projected/7bc9046e-be5c-4615-af31-2fa594c57289-kube-api-access-zwcb7\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089816 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.089938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089834 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vc95v\" (UniqueName: \"kubernetes.io/projected/a4522390-be08-4472-b1b5-4d1db64090fb-kube-api-access-vc95v\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: 
\"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.089938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089858 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-root\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.089938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089904 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-tls\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089934 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-root\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089945 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc9046e-be5c-4615-af31-2fa594c57289-metrics-client-ca\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.089971 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090024 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4522390-be08-4472-b1b5-4d1db64090fb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090057 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-accelerators-collector-config\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090068 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a4522390-be08-4472-b1b5-4d1db64090fb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090112 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.090178 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090140 2567 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-sys\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090596 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090223 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bc9046e-be5c-4615-af31-2fa594c57289-sys\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090596 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090452 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a4522390-be08-4472-b1b5-4d1db64090fb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.090596 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090484 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.090596 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090486 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-textfile\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " 
pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.090807 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.090596 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bc9046e-be5c-4615-af31-2fa594c57289-metrics-client-ca\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.091103 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.091079 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a4522390-be08-4472-b1b5-4d1db64090fb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.092349 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.092322 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.092456 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.092413 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.092544 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.092511 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a4522390-be08-4472-b1b5-4d1db64090fb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.092673 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.092652 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7bc9046e-be5c-4615-af31-2fa594c57289-node-exporter-tls\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.097020 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.096999 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwcb7\" (UniqueName: \"kubernetes.io/projected/7bc9046e-be5c-4615-af31-2fa594c57289-kube-api-access-zwcb7\") pod \"node-exporter-27ts9\" (UID: \"7bc9046e-be5c-4615-af31-2fa594c57289\") " pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.097331 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.097311 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc95v\" (UniqueName: \"kubernetes.io/projected/a4522390-be08-4472-b1b5-4d1db64090fb-kube-api-access-vc95v\") pod \"kube-state-metrics-69db897b98-lvpjg\" (UID: \"a4522390-be08-4472-b1b5-4d1db64090fb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.207215 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.207124 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-27ts9" Apr 17 21:16:13.214969 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:13.214940 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bc9046e_be5c_4615_af31_2fa594c57289.slice/crio-378508686c11faa63f6b993f966d878b7a03e14eba6cc0ee10a21042180ce752 WatchSource:0}: Error finding container 378508686c11faa63f6b993f966d878b7a03e14eba6cc0ee10a21042180ce752: Status 404 returned error can't find the container with id 378508686c11faa63f6b993f966d878b7a03e14eba6cc0ee10a21042180ce752 Apr 17 21:16:13.230252 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.230225 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" Apr 17 21:16:13.348416 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.348381 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-lvpjg"] Apr 17 21:16:13.350971 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:13.350941 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4522390_be08_4472_b1b5_4d1db64090fb.slice/crio-2f7287e97d182582a552ac172fd3a6ebb5e77a51eeab4a3585af1323d3dade4a WatchSource:0}: Error finding container 2f7287e97d182582a552ac172fd3a6ebb5e77a51eeab4a3585af1323d3dade4a: Status 404 returned error can't find the container with id 2f7287e97d182582a552ac172fd3a6ebb5e77a51eeab4a3585af1323d3dade4a Apr 17 21:16:13.430651 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.430612 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" event={"ID":"a4522390-be08-4472-b1b5-4d1db64090fb","Type":"ContainerStarted","Data":"2f7287e97d182582a552ac172fd3a6ebb5e77a51eeab4a3585af1323d3dade4a"} Apr 17 21:16:13.431697 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.431659 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-27ts9" event={"ID":"7bc9046e-be5c-4615-af31-2fa594c57289","Type":"ContainerStarted","Data":"378508686c11faa63f6b993f966d878b7a03e14eba6cc0ee10a21042180ce752"} Apr 17 21:16:13.952488 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.952448 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:16:13.955768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.955738 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.958172 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958128 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 21:16:13.958445 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958389 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 21:16:13.958445 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958409 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 21:16:13.958978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958800 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 21:16:13.958978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958853 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 21:16:13.958978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958865 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 21:16:13.959164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.958988 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 21:16:13.959164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.959106 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 21:16:13.959164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.959144 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m7n8s\"" Apr 17 21:16:13.959344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.959326 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 21:16:13.967948 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.967887 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:16:13.998494 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998453 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998691 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998507 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998691 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998650 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998759 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998784 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998943 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998809 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6dtjs\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-kube-api-access-6dtjs\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998990 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998929 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.998990 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998969 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.999083 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.998998 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.999083 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.999076 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-volume\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.999181 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.999116 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-out\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:13.999181 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:13.999147 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-web-config\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100237 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100290 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100317 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtjs\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-kube-api-access-6dtjs\") pod 
\"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100350 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100379 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100406 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100463 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-volume\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-out\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.100553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100530 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-web-config\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100582 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100609 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100644 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.100680 2567 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.101061 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101674 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.101462 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.101777 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.101757 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.104182 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.103748 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.104182 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:16:14.104143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.104573 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.104553 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-web-config\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.105264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.105237 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.127014 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.119807 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-out\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.127014 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.120244 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.127014 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.123114 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-volume\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.127014 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.123810 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.134994 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.134713 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.136782 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.136746 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtjs\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-kube-api-access-6dtjs\") pod \"alertmanager-main-0\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.270000 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.269910 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:16:14.416667 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.416631 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:16:14.436614 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.436574 2567 generic.go:358] "Generic (PLEG): container finished" podID="7bc9046e-be5c-4615-af31-2fa594c57289" containerID="a3d2d9d364ecc53d48f0bffc3a3094ec1c7bdcb3a155e17b753db9a1403f493d" exitCode=0 Apr 17 21:16:14.436779 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.436648 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-27ts9" event={"ID":"7bc9046e-be5c-4615-af31-2fa594c57289","Type":"ContainerDied","Data":"a3d2d9d364ecc53d48f0bffc3a3094ec1c7bdcb3a155e17b753db9a1403f493d"} Apr 17 21:16:14.656757 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:14.656720 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b4b530_d97b_4a1f_af94_60a53cc3202d.slice/crio-75f86ffc9d3d0a874841094eee5cfda0e7645d75ea83a84797a68688995a0471 WatchSource:0}: Error finding container 75f86ffc9d3d0a874841094eee5cfda0e7645d75ea83a84797a68688995a0471: Status 404 returned error can't find the container with id 75f86ffc9d3d0a874841094eee5cfda0e7645d75ea83a84797a68688995a0471 Apr 17 21:16:14.907799 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.907699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt" Apr 17 21:16:14.910401 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:14.910377 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95290018-54bc-46b1-8b24-b0bae6086a51-metrics-certs\") pod \"network-metrics-daemon-ndfzt\" (UID: \"95290018-54bc-46b1-8b24-b0bae6086a51\") " pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:16:15.166873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.166780 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zzp4n\""
Apr 17 21:16:15.174875 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.174841 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ndfzt"
Apr 17 21:16:15.309594 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.309454 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ndfzt"]
Apr 17 21:16:15.312637 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:15.312601 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95290018_54bc_46b1_8b24_b0bae6086a51.slice/crio-96cd455bb3ba65ee4e9d05bcdbe44149161efcc39032fcd54ae0f4979668b136 WatchSource:0}: Error finding container 96cd455bb3ba65ee4e9d05bcdbe44149161efcc39032fcd54ae0f4979668b136: Status 404 returned error can't find the container with id 96cd455bb3ba65ee4e9d05bcdbe44149161efcc39032fcd54ae0f4979668b136
Apr 17 21:16:15.442393 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.442304 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ndfzt" event={"ID":"95290018-54bc-46b1-8b24-b0bae6086a51","Type":"ContainerStarted","Data":"96cd455bb3ba65ee4e9d05bcdbe44149161efcc39032fcd54ae0f4979668b136"}
Apr 17 21:16:15.444631 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.444581 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" event={"ID":"a4522390-be08-4472-b1b5-4d1db64090fb","Type":"ContainerStarted","Data":"ec571afb2cb73fb11082f115d72d2b1b29bd583563da40983f48c291d91b2712"}
Apr 17 21:16:15.444631 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.444626 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" event={"ID":"a4522390-be08-4472-b1b5-4d1db64090fb","Type":"ContainerStarted","Data":"5f56438b40ad84eb90edb98a9945b64089ae00af549c2ee0fb99d3c4e993c294"}
Apr 17 21:16:15.444866 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.444642 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" event={"ID":"a4522390-be08-4472-b1b5-4d1db64090fb","Type":"ContainerStarted","Data":"9065f5b3fe7484bbd573eee161cad3b509b822cd13fcb24035352b8e81cf0754"}
Apr 17 21:16:15.447021 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.446988 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-27ts9" event={"ID":"7bc9046e-be5c-4615-af31-2fa594c57289","Type":"ContainerStarted","Data":"aa8437cff4f83b07f6d2d8a7b5b475ee044bd521554ec9a6125d0102bb834a83"}
Apr 17 21:16:15.447183 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.447028 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-27ts9" event={"ID":"7bc9046e-be5c-4615-af31-2fa594c57289","Type":"ContainerStarted","Data":"3b76be155dd3adbad2da89c74758486d1d378c6842d1f2e180e6576c278cd395"}
Apr 17 21:16:15.448214 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.448182 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"75f86ffc9d3d0a874841094eee5cfda0e7645d75ea83a84797a68688995a0471"}
Apr 17 21:16:15.462659 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.462604 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-lvpjg" podStartSLOduration=2.124028326 podStartE2EDuration="3.462585878s" podCreationTimestamp="2026-04-17 21:16:12 +0000 UTC" firstStartedPulling="2026-04-17 21:16:13.352771559 +0000 UTC m=+128.922284290" lastFinishedPulling="2026-04-17 21:16:14.691329109 +0000 UTC m=+130.260841842" observedRunningTime="2026-04-17 21:16:15.461385414 +0000 UTC m=+131.030898169" watchObservedRunningTime="2026-04-17 21:16:15.462585878 +0000 UTC m=+131.032098634"
Apr 17 21:16:15.481479 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:15.481417 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-27ts9" podStartSLOduration=2.613814671 podStartE2EDuration="3.481389361s" podCreationTimestamp="2026-04-17 21:16:12 +0000 UTC" firstStartedPulling="2026-04-17 21:16:13.216755945 +0000 UTC m=+128.786268676" lastFinishedPulling="2026-04-17 21:16:14.084330635 +0000 UTC m=+129.653843366" observedRunningTime="2026-04-17 21:16:15.48099552 +0000 UTC m=+131.050508286" watchObservedRunningTime="2026-04-17 21:16:15.481389361 +0000 UTC m=+131.050902116"
Apr 17 21:16:16.452637 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:16.452596 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20" exitCode=0
Apr 17 21:16:16.453095 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:16.452666 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20"}
Apr 17 21:16:17.051380 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.051301 2567 scope.go:117] "RemoveContainer" containerID="372ecb8ebe909abaf94b7e33931aa4967a37309d76d249596b69308d36a63701"
Apr 17 21:16:17.051552 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:16:17.051480 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2d498_openshift-console-operator(c7d6b966-299f-473e-b704-4ce1b867b0b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podUID="c7d6b966-299f-473e-b704-4ce1b867b0b5"
Apr 17 21:16:17.202643 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.202606 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"]
Apr 17 21:16:17.204762 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.204744 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.207263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.207239 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 21:16:17.207263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.207251 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-5hglt\""
Apr 17 21:16:17.207404 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.207251 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 21:16:17.208296 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.208278 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 21:16:17.208360 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.208292 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 21:16:17.208360 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.208295 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6qo1543u9n09n\""
Apr 17 21:16:17.217318 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.217293 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"]
Apr 17 21:16:17.231053 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231030 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f902dfc2-4680-4303-9548-92e70e5538b0-metrics-server-audit-profiles\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.231170 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231066 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-secret-metrics-server-client-certs\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.231170 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231140 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f902dfc2-4680-4303-9548-92e70e5538b0-audit-log\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.231238 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231190 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-secret-metrics-server-tls\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.231238 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231209 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzdf\" (UniqueName: \"kubernetes.io/projected/f902dfc2-4680-4303-9548-92e70e5538b0-kube-api-access-rnzdf\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.231299 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f902dfc2-4680-4303-9548-92e70e5538b0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.231334 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.231309 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-client-ca-bundle\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332057 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.331964 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f902dfc2-4680-4303-9548-92e70e5538b0-metrics-server-audit-profiles\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332057 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332019 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-secret-metrics-server-client-certs\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332062 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f902dfc2-4680-4303-9548-92e70e5538b0-audit-log\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332111 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-secret-metrics-server-tls\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332135 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzdf\" (UniqueName: \"kubernetes.io/projected/f902dfc2-4680-4303-9548-92e70e5538b0-kube-api-access-rnzdf\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332179 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f902dfc2-4680-4303-9548-92e70e5538b0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332206 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-client-ca-bundle\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332487 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f902dfc2-4680-4303-9548-92e70e5538b0-audit-log\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.332957 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.332928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f902dfc2-4680-4303-9548-92e70e5538b0-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.333099 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.333069 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f902dfc2-4680-4303-9548-92e70e5538b0-metrics-server-audit-profiles\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.334717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.334687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-secret-metrics-server-client-certs\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.334807 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.334700 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-client-ca-bundle\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.334807 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.334802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f902dfc2-4680-4303-9548-92e70e5538b0-secret-metrics-server-tls\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.340689 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.340670 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzdf\" (UniqueName: \"kubernetes.io/projected/f902dfc2-4680-4303-9548-92e70e5538b0-kube-api-access-rnzdf\") pod \"metrics-server-578cd5b9d8-6lxxg\" (UID: \"f902dfc2-4680-4303-9548-92e70e5538b0\") " pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.458301 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.458267 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ndfzt" event={"ID":"95290018-54bc-46b1-8b24-b0bae6086a51","Type":"ContainerStarted","Data":"937cbff118b8a4bd52c8a88ae23a594c14e05413ba14d005bc092d0e387d560f"}
Apr 17 21:16:17.458301 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.458302 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ndfzt" event={"ID":"95290018-54bc-46b1-8b24-b0bae6086a51","Type":"ContainerStarted","Data":"bcd21a1a9e0be538a49660e302227c79f45384baf9774719b785bafc74ea98ef"}
Apr 17 21:16:17.474741 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.474692 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ndfzt" podStartSLOduration=130.996020343 podStartE2EDuration="2m12.474676129s" podCreationTimestamp="2026-04-17 21:14:05 +0000 UTC" firstStartedPulling="2026-04-17 21:16:15.31523176 +0000 UTC m=+130.884744510" lastFinishedPulling="2026-04-17 21:16:16.793887564 +0000 UTC m=+132.363400296" observedRunningTime="2026-04-17 21:16:17.47320989 +0000 UTC m=+133.042722670" watchObservedRunningTime="2026-04-17 21:16:17.474676129 +0000 UTC m=+133.044188875"
Apr 17 21:16:17.513559 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.513504 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:17.646793 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:17.646753 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"]
Apr 17 21:16:17.650770 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:17.650728 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf902dfc2_4680_4303_9548_92e70e5538b0.slice/crio-9b69dbb2fbfd8d71bafcbeaae987b971f1cc182188ce1448bd03d71924549cd5 WatchSource:0}: Error finding container 9b69dbb2fbfd8d71bafcbeaae987b971f1cc182188ce1448bd03d71924549cd5: Status 404 returned error can't find the container with id 9b69dbb2fbfd8d71bafcbeaae987b971f1cc182188ce1448bd03d71924549cd5
Apr 17 21:16:18.465152 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:18.465108 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94"}
Apr 17 21:16:18.465665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:18.465173 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"}
Apr 17 21:16:18.465665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:18.465186 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"}
Apr 17 21:16:18.465665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:18.465194 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"}
Apr 17 21:16:18.465665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:18.465206 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"}
Apr 17 21:16:18.466719 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:18.466687 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg" event={"ID":"f902dfc2-4680-4303-9548-92e70e5538b0","Type":"ContainerStarted","Data":"9b69dbb2fbfd8d71bafcbeaae987b971f1cc182188ce1448bd03d71924549cd5"}
Apr 17 21:16:19.473651 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:19.473509 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerStarted","Data":"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad"}
Apr 17 21:16:19.474959 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:19.474930 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg" event={"ID":"f902dfc2-4680-4303-9548-92e70e5538b0","Type":"ContainerStarted","Data":"d22a2e534ff37095fac98d1d54031b518ea8fae1d5aa40fb70f66575cf801f34"}
Apr 17 21:16:19.506111 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:19.506055 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.810072248 podStartE2EDuration="6.506039691s" podCreationTimestamp="2026-04-17 21:16:13 +0000 UTC" firstStartedPulling="2026-04-17 21:16:14.658969396 +0000 UTC m=+130.228482129" lastFinishedPulling="2026-04-17 21:16:19.354936826 +0000 UTC m=+134.924449572" observedRunningTime="2026-04-17 21:16:19.5044893 +0000 UTC m=+135.074002068" watchObservedRunningTime="2026-04-17 21:16:19.506039691 +0000 UTC m=+135.075552443"
Apr 17 21:16:19.530621 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:19.530558 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg" podStartSLOduration=1.010994005 podStartE2EDuration="2.530537803s" podCreationTimestamp="2026-04-17 21:16:17 +0000 UTC" firstStartedPulling="2026-04-17 21:16:17.652691892 +0000 UTC m=+133.222204638" lastFinishedPulling="2026-04-17 21:16:19.172235704 +0000 UTC m=+134.741748436" observedRunningTime="2026-04-17 21:16:19.529003485 +0000 UTC m=+135.098516275" watchObservedRunningTime="2026-04-17 21:16:19.530537803 +0000 UTC m=+135.100050555"
Apr 17 21:16:27.415869 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:27.415836 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-76c8559dfb-m94ms"
Apr 17 21:16:28.051363 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.051328 2567 scope.go:117] "RemoveContainer" containerID="372ecb8ebe909abaf94b7e33931aa4967a37309d76d249596b69308d36a63701"
Apr 17 21:16:28.506126 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.506090 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log"
Apr 17 21:16:28.506584 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.506174 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" event={"ID":"c7d6b966-299f-473e-b704-4ce1b867b0b5","Type":"ContainerStarted","Data":"7074e3a9dee11e3e08b85c9b545548fc5daf5f9c6a0af2a10a72e49766b935d4"}
Apr 17 21:16:28.507207 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.507175 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2d498"
Apr 17 21:16:28.511708 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.511686 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2d498"
Apr 17 21:16:28.537584 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.537536 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2d498" podStartSLOduration=52.654689084 podStartE2EDuration="54.537506357s" podCreationTimestamp="2026-04-17 21:15:34 +0000 UTC" firstStartedPulling="2026-04-17 21:15:35.081168153 +0000 UTC m=+90.650680884" lastFinishedPulling="2026-04-17 21:15:36.96398542 +0000 UTC m=+92.533498157" observedRunningTime="2026-04-17 21:16:28.535561656 +0000 UTC m=+144.105074411" watchObservedRunningTime="2026-04-17 21:16:28.537506357 +0000 UTC m=+144.107019112"
Apr 17 21:16:28.630225 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.630189 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-tn47b"]
Apr 17 21:16:28.632614 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.632598 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tn47b"
Apr 17 21:16:28.635135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.635114 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 21:16:28.635135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.635120 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-96c2q\""
Apr 17 21:16:28.635286 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.635242 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 21:16:28.641834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.641809 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tn47b"]
Apr 17 21:16:28.737855 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.737818 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msbd4\" (UniqueName: \"kubernetes.io/projected/2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531-kube-api-access-msbd4\") pod \"downloads-6bcc868b7-tn47b\" (UID: \"2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531\") " pod="openshift-console/downloads-6bcc868b7-tn47b"
Apr 17 21:16:28.839095 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.838999 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msbd4\" (UniqueName: \"kubernetes.io/projected/2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531-kube-api-access-msbd4\") pod \"downloads-6bcc868b7-tn47b\" (UID: \"2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531\") " pod="openshift-console/downloads-6bcc868b7-tn47b"
Apr 17 21:16:28.846726 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.846702 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msbd4\" (UniqueName: \"kubernetes.io/projected/2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531-kube-api-access-msbd4\") pod \"downloads-6bcc868b7-tn47b\" (UID: \"2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531\") " pod="openshift-console/downloads-6bcc868b7-tn47b"
Apr 17 21:16:28.943318 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:28.943270 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tn47b"
Apr 17 21:16:29.062669 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:29.062475 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tn47b"]
Apr 17 21:16:29.064989 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:29.064961 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d3a0e1b_9289_4b0f_8208_4ce9ce9a9531.slice/crio-f438ebf11a95be6bdca95ac3fe620fea3a674229657a7df4ae4407d71ea86541 WatchSource:0}: Error finding container f438ebf11a95be6bdca95ac3fe620fea3a674229657a7df4ae4407d71ea86541: Status 404 returned error can't find the container with id f438ebf11a95be6bdca95ac3fe620fea3a674229657a7df4ae4407d71ea86541
Apr 17 21:16:29.509866 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:29.509831 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tn47b" event={"ID":"2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531","Type":"ContainerStarted","Data":"f438ebf11a95be6bdca95ac3fe620fea3a674229657a7df4ae4407d71ea86541"}
Apr 17 21:16:37.513630 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:37.513592 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:37.514109 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:37.513645 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg"
Apr 17 21:16:38.756230 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.756191 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68b7bd9ff5-kh7mv"]
Apr 17 21:16:38.760087 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.760062 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.762653 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.762629 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 21:16:38.762809 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.762786 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 21:16:38.762893 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.762877 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 21:16:38.762952 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.762934 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 21:16:38.763706 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.763674 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 21:16:38.763807 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.763750 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wnz9q\""
Apr 17 21:16:38.769913 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.769893 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b7bd9ff5-kh7mv"]
Apr 17 21:16:38.836470 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.836429 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-oauth-config\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.836675 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.836480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-oauth-serving-cert\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.836675 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.836539 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4sw4\" (UniqueName: \"kubernetes.io/projected/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-kube-api-access-r4sw4\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.836675 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.836627 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-config\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.836777 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.836705 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-serving-cert\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.836777 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.836732 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-service-ca\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938050 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938013 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-service-ca\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938085 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-oauth-config\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938225 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-oauth-serving-cert\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938370 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938256 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4sw4\" (UniqueName: \"kubernetes.io/projected/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-kube-api-access-r4sw4\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938370 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938343 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-config\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938473 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938407 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-serving-cert\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938973 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938944 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-oauth-serving-cert\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.938973 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.938961 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-service-ca\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.939933 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.939909 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-config\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.940971 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.940948 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-oauth-config\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.941053 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.941024 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-serving-cert\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:38.947243 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:38.947208 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4sw4\" (UniqueName: \"kubernetes.io/projected/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-kube-api-access-r4sw4\") pod \"console-68b7bd9ff5-kh7mv\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " pod="openshift-console/console-68b7bd9ff5-kh7mv"
Apr 17 21:16:39.073847 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:39.073735 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:16:42.352871 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:16:42.352824 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4d6rl" podUID="49118387-7ece-4934-bcd2-c3a2447f3933" Apr 17 21:16:42.377122 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:16:42.377078 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-l25qj" podUID="46806929-68d7-4f2b-a1a4-39799c177ba4" Apr 17 21:16:42.548699 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:42.548668 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l25qj" Apr 17 21:16:42.548897 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:42.548844 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-4d6rl" Apr 17 21:16:44.585310 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:44.585282 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b7bd9ff5-kh7mv"] Apr 17 21:16:44.590666 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:44.590642 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef96d27_a7d8_4bd4_ba1d_055b1da95988.slice/crio-626886a1f22c52b0c25fa9cf3d0fe2577589e3ad3e2b183320286ef05fa7b9ad WatchSource:0}: Error finding container 626886a1f22c52b0c25fa9cf3d0fe2577589e3ad3e2b183320286ef05fa7b9ad: Status 404 returned error can't find the container with id 626886a1f22c52b0c25fa9cf3d0fe2577589e3ad3e2b183320286ef05fa7b9ad Apr 17 21:16:45.559930 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:45.559856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tn47b" event={"ID":"2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531","Type":"ContainerStarted","Data":"f115cd62f5b7a775712786e58b1a5821040a4910c9adeaa6f89dad6c3666811a"} Apr 17 21:16:45.560145 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:45.560068 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-tn47b" Apr 17 21:16:45.561102 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:45.561075 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b7bd9ff5-kh7mv" event={"ID":"4ef96d27-a7d8-4bd4-ba1d-055b1da95988","Type":"ContainerStarted","Data":"626886a1f22c52b0c25fa9cf3d0fe2577589e3ad3e2b183320286ef05fa7b9ad"} Apr 17 21:16:45.567705 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:45.567670 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-tn47b" Apr 17 21:16:45.578451 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:45.577886 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-tn47b" podStartSLOduration=2.086220578 podStartE2EDuration="17.577871017s" podCreationTimestamp="2026-04-17 21:16:28 +0000 UTC" firstStartedPulling="2026-04-17 21:16:29.066745123 +0000 UTC m=+144.636257853" lastFinishedPulling="2026-04-17 21:16:44.558395546 +0000 UTC m=+160.127908292" observedRunningTime="2026-04-17 21:16:45.57549291 +0000 UTC m=+161.145005663" watchObservedRunningTime="2026-04-17 21:16:45.577871017 +0000 UTC m=+161.147383774" Apr 17 21:16:47.321709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.321601 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: \"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj" Apr 17 21:16:47.321709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.321663 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl" Apr 17 21:16:47.324338 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.324308 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49118387-7ece-4934-bcd2-c3a2447f3933-metrics-tls\") pod \"dns-default-4d6rl\" (UID: \"49118387-7ece-4934-bcd2-c3a2447f3933\") " pod="openshift-dns/dns-default-4d6rl" Apr 17 21:16:47.324483 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.324412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46806929-68d7-4f2b-a1a4-39799c177ba4-cert\") pod \"ingress-canary-l25qj\" (UID: 
\"46806929-68d7-4f2b-a1a4-39799c177ba4\") " pod="openshift-ingress-canary/ingress-canary-l25qj" Apr 17 21:16:47.351967 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.351933 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ggwsl\"" Apr 17 21:16:47.353017 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.352985 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-k464c\"" Apr 17 21:16:47.360126 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.360056 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4d6rl" Apr 17 21:16:47.360880 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:47.360101 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-l25qj" Apr 17 21:16:48.346158 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.344824 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bfcd69665-nz7jn"] Apr 17 21:16:48.427374 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.427338 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bfcd69665-nz7jn"] Apr 17 21:16:48.427572 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.427498 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432622 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432575 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-serving-cert\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432770 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432625 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-oauth-serving-cert\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432770 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432711 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-oauth-config\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432770 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432764 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-service-ca\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432942 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432804 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8qv\" 
(UniqueName: \"kubernetes.io/projected/215e7a1f-e930-47c1-80f1-4820191852ae-kube-api-access-qp8qv\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432942 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432853 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-console-config\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.432942 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.432925 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-trusted-ca-bundle\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.436432 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.436201 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 21:16:48.533735 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-serving-cert\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.533735 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533737 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-oauth-serving-cert\") pod 
\"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.533994 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533797 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-oauth-config\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.533994 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533824 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-service-ca\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.533994 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533855 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8qv\" (UniqueName: \"kubernetes.io/projected/215e7a1f-e930-47c1-80f1-4820191852ae-kube-api-access-qp8qv\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.533994 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533884 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-console-config\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.533994 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.533913 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-trusted-ca-bundle\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.535044 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.534858 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-oauth-serving-cert\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.535044 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.534908 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-service-ca\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.535044 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.534986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-console-config\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.535603 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.535580 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-trusted-ca-bundle\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.536668 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.536649 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-oauth-config\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.536771 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.536697 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-serving-cert\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.541953 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.541931 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8qv\" (UniqueName: \"kubernetes.io/projected/215e7a1f-e930-47c1-80f1-4820191852ae-kube-api-access-qp8qv\") pod \"console-bfcd69665-nz7jn\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.733180 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.733106 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-l25qj"] Apr 17 21:16:48.735654 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:48.735613 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46806929_68d7_4f2b_a1a4_39799c177ba4.slice/crio-c9d48ba6b54572b69b3604f2d2c57efa7937c7f4b9b692d74e50f9b7fd2a3d7b WatchSource:0}: Error finding container c9d48ba6b54572b69b3604f2d2c57efa7937c7f4b9b692d74e50f9b7fd2a3d7b: Status 404 returned error can't find the container with id c9d48ba6b54572b69b3604f2d2c57efa7937c7f4b9b692d74e50f9b7fd2a3d7b Apr 17 21:16:48.740698 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.740673 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:48.751783 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.751757 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4d6rl"] Apr 17 21:16:48.755511 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:48.755484 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49118387_7ece_4934_bcd2_c3a2447f3933.slice/crio-1a07a797c3bf3ddf3fd9288e9e8a2f2c25a967d74ccb464397bce1913c881702 WatchSource:0}: Error finding container 1a07a797c3bf3ddf3fd9288e9e8a2f2c25a967d74ccb464397bce1913c881702: Status 404 returned error can't find the container with id 1a07a797c3bf3ddf3fd9288e9e8a2f2c25a967d74ccb464397bce1913c881702 Apr 17 21:16:48.879645 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:48.879567 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bfcd69665-nz7jn"] Apr 17 21:16:48.981424 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:16:48.981372 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215e7a1f_e930_47c1_80f1_4820191852ae.slice/crio-618ee2e431436e17be5909aa883d7d93cfce0de837c8a92d2a76d2eeea7325ef WatchSource:0}: Error finding container 618ee2e431436e17be5909aa883d7d93cfce0de837c8a92d2a76d2eeea7325ef: Status 404 returned error can't find the container with id 618ee2e431436e17be5909aa883d7d93cfce0de837c8a92d2a76d2eeea7325ef Apr 17 21:16:49.576591 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.576549 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l25qj" event={"ID":"46806929-68d7-4f2b-a1a4-39799c177ba4","Type":"ContainerStarted","Data":"c9d48ba6b54572b69b3604f2d2c57efa7937c7f4b9b692d74e50f9b7fd2a3d7b"} Apr 17 21:16:49.578626 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.578571 2567 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfcd69665-nz7jn" event={"ID":"215e7a1f-e930-47c1-80f1-4820191852ae","Type":"ContainerStarted","Data":"bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b"} Apr 17 21:16:49.578626 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.578608 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfcd69665-nz7jn" event={"ID":"215e7a1f-e930-47c1-80f1-4820191852ae","Type":"ContainerStarted","Data":"618ee2e431436e17be5909aa883d7d93cfce0de837c8a92d2a76d2eeea7325ef"} Apr 17 21:16:49.581455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.581428 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b7bd9ff5-kh7mv" event={"ID":"4ef96d27-a7d8-4bd4-ba1d-055b1da95988","Type":"ContainerStarted","Data":"7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab"} Apr 17 21:16:49.583187 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.583159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4d6rl" event={"ID":"49118387-7ece-4934-bcd2-c3a2447f3933","Type":"ContainerStarted","Data":"1a07a797c3bf3ddf3fd9288e9e8a2f2c25a967d74ccb464397bce1913c881702"} Apr 17 21:16:49.598507 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.597085 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bfcd69665-nz7jn" podStartSLOduration=1.597066167 podStartE2EDuration="1.597066167s" podCreationTimestamp="2026-04-17 21:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:16:49.595291425 +0000 UTC m=+165.164804178" watchObservedRunningTime="2026-04-17 21:16:49.597066167 +0000 UTC m=+165.166578921" Apr 17 21:16:49.613301 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:49.612184 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-68b7bd9ff5-kh7mv" podStartSLOduration=7.193618129 podStartE2EDuration="11.612142981s" podCreationTimestamp="2026-04-17 21:16:38 +0000 UTC" firstStartedPulling="2026-04-17 21:16:44.592704588 +0000 UTC m=+160.162217319" lastFinishedPulling="2026-04-17 21:16:49.011229425 +0000 UTC m=+164.580742171" observedRunningTime="2026-04-17 21:16:49.612094691 +0000 UTC m=+165.181607436" watchObservedRunningTime="2026-04-17 21:16:49.612142981 +0000 UTC m=+165.181655735" Apr 17 21:16:52.600102 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:52.600063 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-l25qj" event={"ID":"46806929-68d7-4f2b-a1a4-39799c177ba4","Type":"ContainerStarted","Data":"328160e85ce2fefcd016d13df174ecd71086dd0c751b002edad37d72a0389f52"} Apr 17 21:16:52.602855 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:52.602822 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4d6rl" event={"ID":"49118387-7ece-4934-bcd2-c3a2447f3933","Type":"ContainerStarted","Data":"9b02335a96ea8f225fe5fc7d3ab5a6dcb308de89dce705314e30248769a5a567"} Apr 17 21:16:52.602855 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:52.602858 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4d6rl" event={"ID":"49118387-7ece-4934-bcd2-c3a2447f3933","Type":"ContainerStarted","Data":"1b6eab3a016e2b0347be9c31a028172bbff875679b06898f323447f32222cb60"} Apr 17 21:16:52.603071 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:52.603050 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4d6rl" Apr 17 21:16:52.616856 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:52.616798 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-l25qj" podStartSLOduration=130.689570732 podStartE2EDuration="2m13.616778447s" podCreationTimestamp="2026-04-17 
21:14:39 +0000 UTC" firstStartedPulling="2026-04-17 21:16:48.738016828 +0000 UTC m=+164.307529560" lastFinishedPulling="2026-04-17 21:16:51.665224542 +0000 UTC m=+167.234737275" observedRunningTime="2026-04-17 21:16:52.615568624 +0000 UTC m=+168.185081375" watchObservedRunningTime="2026-04-17 21:16:52.616778447 +0000 UTC m=+168.186291200" Apr 17 21:16:52.633126 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:52.633050 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4d6rl" podStartSLOduration=130.730964804 podStartE2EDuration="2m13.633032008s" podCreationTimestamp="2026-04-17 21:14:39 +0000 UTC" firstStartedPulling="2026-04-17 21:16:48.757820248 +0000 UTC m=+164.327332992" lastFinishedPulling="2026-04-17 21:16:51.659887445 +0000 UTC m=+167.229400196" observedRunningTime="2026-04-17 21:16:52.630809681 +0000 UTC m=+168.200322434" watchObservedRunningTime="2026-04-17 21:16:52.633032008 +0000 UTC m=+168.202544762" Apr 17 21:16:57.519106 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:57.519032 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg" Apr 17 21:16:57.522902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:57.522876 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-578cd5b9d8-6lxxg" Apr 17 21:16:58.741382 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:58.741345 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:58.741382 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:58.741389 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:58.746289 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:58.746266 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:59.074135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:59.074057 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:16:59.074135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:59.074094 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:16:59.078715 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:59.078692 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:16:59.632786 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:59.632760 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:16:59.633175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:59.633149 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:16:59.692018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:16:59.691981 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b7bd9ff5-kh7mv"] Apr 17 21:17:02.610514 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:02.610486 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4d6rl" Apr 17 21:17:26.653317 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.653252 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68b7bd9ff5-kh7mv" podUID="4ef96d27-a7d8-4bd4-ba1d-055b1da95988" containerName="console" containerID="cri-o://7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab" gracePeriod=15 Apr 17 21:17:26.905652 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.905594 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-68b7bd9ff5-kh7mv_4ef96d27-a7d8-4bd4-ba1d-055b1da95988/console/0.log" Apr 17 21:17:26.905768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.905657 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:17:26.993817 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.993777 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-oauth-config\") pod \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " Apr 17 21:17:26.993965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.993839 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-serving-cert\") pod \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " Apr 17 21:17:26.993965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.993874 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-config\") pod \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " Apr 17 21:17:26.993965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.993917 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4sw4\" (UniqueName: \"kubernetes.io/projected/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-kube-api-access-r4sw4\") pod \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " Apr 17 21:17:26.993965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.993943 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-oauth-serving-cert\") pod \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " Apr 17 21:17:26.994189 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.993972 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-service-ca\") pod \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\" (UID: \"4ef96d27-a7d8-4bd4-ba1d-055b1da95988\") " Apr 17 21:17:26.994419 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.994378 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-config" (OuterVolumeSpecName: "console-config") pod "4ef96d27-a7d8-4bd4-ba1d-055b1da95988" (UID: "4ef96d27-a7d8-4bd4-ba1d-055b1da95988"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:26.994419 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.994406 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4ef96d27-a7d8-4bd4-ba1d-055b1da95988" (UID: "4ef96d27-a7d8-4bd4-ba1d-055b1da95988"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:26.994609 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.994430 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-service-ca" (OuterVolumeSpecName: "service-ca") pod "4ef96d27-a7d8-4bd4-ba1d-055b1da95988" (UID: "4ef96d27-a7d8-4bd4-ba1d-055b1da95988"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:26.996193 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.996173 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-kube-api-access-r4sw4" (OuterVolumeSpecName: "kube-api-access-r4sw4") pod "4ef96d27-a7d8-4bd4-ba1d-055b1da95988" (UID: "4ef96d27-a7d8-4bd4-ba1d-055b1da95988"). InnerVolumeSpecName "kube-api-access-r4sw4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:17:26.996263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.996184 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4ef96d27-a7d8-4bd4-ba1d-055b1da95988" (UID: "4ef96d27-a7d8-4bd4-ba1d-055b1da95988"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:26.996310 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:26.996257 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4ef96d27-a7d8-4bd4-ba1d-055b1da95988" (UID: "4ef96d27-a7d8-4bd4-ba1d-055b1da95988"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:27.095121 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.095088 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-oauth-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:27.095121 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.095116 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:27.095121 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.095126 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-console-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:27.095342 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.095139 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4sw4\" (UniqueName: \"kubernetes.io/projected/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-kube-api-access-r4sw4\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:27.095342 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.095149 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-oauth-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:27.095342 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.095160 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ef96d27-a7d8-4bd4-ba1d-055b1da95988-service-ca\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:27.711277 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:17:27.711251 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b7bd9ff5-kh7mv_4ef96d27-a7d8-4bd4-ba1d-055b1da95988/console/0.log" Apr 17 21:17:27.711717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.711294 2567 generic.go:358] "Generic (PLEG): container finished" podID="4ef96d27-a7d8-4bd4-ba1d-055b1da95988" containerID="7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab" exitCode=2 Apr 17 21:17:27.711717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.711358 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b7bd9ff5-kh7mv" event={"ID":"4ef96d27-a7d8-4bd4-ba1d-055b1da95988","Type":"ContainerDied","Data":"7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab"} Apr 17 21:17:27.711717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.711367 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b7bd9ff5-kh7mv" Apr 17 21:17:27.711717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.711385 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b7bd9ff5-kh7mv" event={"ID":"4ef96d27-a7d8-4bd4-ba1d-055b1da95988","Type":"ContainerDied","Data":"626886a1f22c52b0c25fa9cf3d0fe2577589e3ad3e2b183320286ef05fa7b9ad"} Apr 17 21:17:27.711717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.711400 2567 scope.go:117] "RemoveContainer" containerID="7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab" Apr 17 21:17:27.719767 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.719745 2567 scope.go:117] "RemoveContainer" containerID="7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab" Apr 17 21:17:27.720051 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:27.720032 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab\": container with ID starting with 7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab not found: ID does not exist" containerID="7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab" Apr 17 21:17:27.720095 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.720061 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab"} err="failed to get container status \"7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab\": rpc error: code = NotFound desc = could not find container \"7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab\": container with ID starting with 7b5b2cd2e9fdc2eb6160e39fa7b2601ea4dd8a2f11b1c6c61424dea5edeba2ab not found: ID does not exist" Apr 17 21:17:27.727665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.727639 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b7bd9ff5-kh7mv"] Apr 17 21:17:27.731681 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:27.731658 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68b7bd9ff5-kh7mv"] Apr 17 21:17:29.054384 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:29.054348 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef96d27-a7d8-4bd4-ba1d-055b1da95988" path="/var/lib/kubelet/pods/4ef96d27-a7d8-4bd4-ba1d-055b1da95988/volumes" Apr 17 21:17:33.152708 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.152671 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:17:33.153211 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.153161 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="alertmanager" 
containerID="cri-o://7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b" gracePeriod=120 Apr 17 21:17:33.153280 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.153194 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-metric" containerID="cri-o://f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94" gracePeriod=120 Apr 17 21:17:33.153280 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.153204 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-web" containerID="cri-o://e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9" gracePeriod=120 Apr 17 21:17:33.153280 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.153243 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="prom-label-proxy" containerID="cri-o://ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad" gracePeriod=120 Apr 17 21:17:33.153280 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.153228 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy" containerID="cri-o://0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8" gracePeriod=120 Apr 17 21:17:33.153280 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.153214 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="config-reloader" 
containerID="cri-o://59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75" gracePeriod=120 Apr 17 21:17:33.733466 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733420 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad" exitCode=0 Apr 17 21:17:33.733466 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733453 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8" exitCode=0 Apr 17 21:17:33.733466 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733460 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75" exitCode=0 Apr 17 21:17:33.733466 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733466 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b" exitCode=0 Apr 17 21:17:33.733736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733491 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad"} Apr 17 21:17:33.733736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733547 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"} Apr 17 21:17:33.733736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733558 2567 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"} Apr 17 21:17:33.733736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:33.733567 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"} Apr 17 21:17:34.397251 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.397228 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:17:34.456007 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.455964 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-web\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456007 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456009 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456030 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-trusted-ca-bundle\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " 
Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456046 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-metrics-client-ca\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456077 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456096 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-out\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456116 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-web-config\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456144 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-main-tls\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456181 
2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-volume\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456214 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-cluster-tls-config\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456664 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456257 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-tls-assets\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456664 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456300 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dtjs\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-kube-api-access-6dtjs\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456664 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456332 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-main-db\") pod \"65b4b530-d97b-4a1f-af94-60a53cc3202d\" (UID: \"65b4b530-d97b-4a1f-af94-60a53cc3202d\") " Apr 17 21:17:34.456822 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456798 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:34.456878 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456851 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:17:34.456934 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.456859 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:17:34.460038 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.459975 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.461309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.460888 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-out" (OuterVolumeSpecName: "config-out") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:17:34.461309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.460912 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.461309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.460937 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.461309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.460972 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:17:34.461309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.461094 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-kube-api-access-6dtjs" (OuterVolumeSpecName: "kube-api-access-6dtjs") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "kube-api-access-6dtjs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:17:34.461691 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.461337 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.461744 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.461702 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-volume" (OuterVolumeSpecName: "config-volume") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.464187 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.464070 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.471630 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.471601 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-web-config" (OuterVolumeSpecName: "web-config") pod "65b4b530-d97b-4a1f-af94-60a53cc3202d" (UID: "65b4b530-d97b-4a1f-af94-60a53cc3202d"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557070 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557102 2567 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-out\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557116 2567 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-web-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557127 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-main-tls\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557138 2567 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-config-volume\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557152 2567 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-cluster-tls-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557164 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557164 2567 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-tls-assets\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557176 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dtjs\" (UniqueName: \"kubernetes.io/projected/65b4b530-d97b-4a1f-af94-60a53cc3202d-kube-api-access-6dtjs\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557188 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-main-db\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557200 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557211 2567 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/65b4b530-d97b-4a1f-af94-60a53cc3202d-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557224 2567 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.557502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.557237 2567 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65b4b530-d97b-4a1f-af94-60a53cc3202d-metrics-client-ca\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:17:34.738885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.738850 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94" exitCode=0 Apr 17 21:17:34.738885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.738877 2567 generic.go:358] "Generic (PLEG): container finished" podID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerID="e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9" exitCode=0 Apr 17 21:17:34.739103 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.738932 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94"} Apr 17 21:17:34.739103 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.738965 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:17:34.739103 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.738982 2567 scope.go:117] "RemoveContainer" containerID="ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad" Apr 17 21:17:34.739103 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.738972 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"} Apr 17 21:17:34.739261 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.739121 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65b4b530-d97b-4a1f-af94-60a53cc3202d","Type":"ContainerDied","Data":"75f86ffc9d3d0a874841094eee5cfda0e7645d75ea83a84797a68688995a0471"} Apr 17 21:17:34.746745 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.746700 2567 scope.go:117] "RemoveContainer" containerID="f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94" Apr 17 21:17:34.753926 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.753909 2567 scope.go:117] "RemoveContainer" containerID="0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8" Apr 17 21:17:34.760655 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.760637 2567 scope.go:117] "RemoveContainer" containerID="e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9" Apr 17 21:17:34.762714 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.762693 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:17:34.766489 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.766469 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:17:34.768459 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.768445 2567 
scope.go:117] "RemoveContainer" containerID="59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75" Apr 17 21:17:34.774910 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.774893 2567 scope.go:117] "RemoveContainer" containerID="7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b" Apr 17 21:17:34.781663 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.781646 2567 scope.go:117] "RemoveContainer" containerID="c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20" Apr 17 21:17:34.788255 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.788233 2567 scope.go:117] "RemoveContainer" containerID="ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad" Apr 17 21:17:34.788631 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.788607 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad\": container with ID starting with ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad not found: ID does not exist" containerID="ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad" Apr 17 21:17:34.788722 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.788639 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad"} err="failed to get container status \"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad\": rpc error: code = NotFound desc = could not find container \"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad\": container with ID starting with ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad not found: ID does not exist" Apr 17 21:17:34.788722 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.788661 2567 scope.go:117] "RemoveContainer" containerID="f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94" 
Apr 17 21:17:34.788952 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.788932 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94\": container with ID starting with f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94 not found: ID does not exist" containerID="f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94"
Apr 17 21:17:34.789023 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.788974 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94"} err="failed to get container status \"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94\": rpc error: code = NotFound desc = could not find container \"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94\": container with ID starting with f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94 not found: ID does not exist"
Apr 17 21:17:34.789023 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.788992 2567 scope.go:117] "RemoveContainer" containerID="0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"
Apr 17 21:17:34.789642 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.789409 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8\": container with ID starting with 0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8 not found: ID does not exist" containerID="0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"
Apr 17 21:17:34.789775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.789658 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"} err="failed to get container status \"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8\": rpc error: code = NotFound desc = could not find container \"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8\": container with ID starting with 0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8 not found: ID does not exist"
Apr 17 21:17:34.789775 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.789685 2567 scope.go:117] "RemoveContainer" containerID="e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.790804 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791398 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-metric"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791430 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-metric"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791445 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="config-reloader"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791453 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="config-reloader"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791462 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791470 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791491 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="alertmanager"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791499 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="alertmanager"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791509 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-web"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791533 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-web"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791553 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef96d27-a7d8-4bd4-ba1d-055b1da95988" containerName="console"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791563 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef96d27-a7d8-4bd4-ba1d-055b1da95988" containerName="console"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791573 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="prom-label-proxy"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791581 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="prom-label-proxy"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791606 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="init-config-reloader"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791614 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="init-config-reloader"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791750 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="alertmanager"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791762 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791772 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef96d27-a7d8-4bd4-ba1d-055b1da95988" containerName="console"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.791780 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9\": container with ID starting with e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9 not found: ID does not exist" containerID="e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791815 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"} err="failed to get container status \"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9\": rpc error: code = NotFound desc = could not find container \"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9\": container with ID starting with e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9 not found: ID does not exist"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791842 2567 scope.go:117] "RemoveContainer" containerID="59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791788 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="config-reloader"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791906 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-web"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791931 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="kube-rbac-proxy-metric"
Apr 17 21:17:34.794555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.791943 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" containerName="prom-label-proxy"
Apr 17 21:17:34.795689 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.795552 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75\": container with ID starting with 59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75 not found: ID does not exist" containerID="59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"
Apr 17 21:17:34.795689 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.795583 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"} err="failed to get container status \"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75\": rpc error: code = NotFound desc = could not find container \"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75\": container with ID starting with 59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75 not found: ID does not exist"
Apr 17 21:17:34.795689 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.795598 2567 scope.go:117] "RemoveContainer" containerID="7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"
Apr 17 21:17:34.795902 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.795880 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b\": container with ID starting with 7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b not found: ID does not exist" containerID="7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"
Apr 17 21:17:34.795946 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.795910 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"} err="failed to get container status \"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b\": rpc error: code = NotFound desc = could not find container \"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b\": container with ID starting with 7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b not found: ID does not exist"
Apr 17 21:17:34.795946 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.795930 2567 scope.go:117] "RemoveContainer" containerID="c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20"
Apr 17 21:17:34.796208 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:17:34.796190 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20\": container with ID starting with c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20 not found: ID does not exist" containerID="c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20"
Apr 17 21:17:34.796262 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796211 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20"} err="failed to get container status \"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20\": rpc error: code = NotFound desc = could not find container \"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20\": container with ID starting with c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20 not found: ID does not exist"
Apr 17 21:17:34.796262 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796226 2567 scope.go:117] "RemoveContainer" containerID="ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad"
Apr 17 21:17:34.796457 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796440 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad"} err="failed to get container status \"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad\": rpc error: code = NotFound desc = could not find container \"ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad\": container with ID starting with ca96dd672c39b7c55ce04be7ccc05bf9dfd4cbab77f7eddac3099bc01f7d1dad not found: ID does not exist"
Apr 17 21:17:34.796497 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796457 2567 scope.go:117] "RemoveContainer" containerID="f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94"
Apr 17 21:17:34.796693 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796674 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94"} err="failed to get container status \"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94\": rpc error: code = NotFound desc = could not find container \"f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94\": container with ID starting with f0cbe4cc39d92eb98dc2f77bd53007f68316d0c3869360cea04de6776e199e94 not found: ID does not exist"
Apr 17 21:17:34.796768 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796696 2567 scope.go:117] "RemoveContainer" containerID="0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"
Apr 17 21:17:34.796969 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796932 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8"} err="failed to get container status \"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8\": rpc error: code = NotFound desc = could not find container \"0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8\": container with ID starting with 0e0140562197928bd85cd10561115dbe24d213b9754fbac2c78d7b8530563fc8 not found: ID does not exist"
Apr 17 21:17:34.797038 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.796983 2567 scope.go:117] "RemoveContainer" containerID="e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"
Apr 17 21:17:34.797347 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.797322 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9"} err="failed to get container status \"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9\": rpc error: code = NotFound desc = could not find container \"e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9\": container with ID starting with e35ec64ab4943d6e4a276bb3ccb33181de88922d2455c69bd99042c3d7d8f3d9 not found: ID does not exist"
Apr 17 21:17:34.797411 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.797361 2567 scope.go:117] "RemoveContainer" containerID="59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"
Apr 17 21:17:34.797639 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.797616 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75"} err="failed to get container status \"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75\": rpc error: code = NotFound desc = could not find container \"59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75\": container with ID starting with 59b9053519aadf815e8864b1891e09b1710bd0da05f4d1d5ee6b476b87e26d75 not found: ID does not exist"
Apr 17 21:17:34.797639 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.797638 2567 scope.go:117] "RemoveContainer" containerID="7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"
Apr 17 21:17:34.797812 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.797794 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b"} err="failed to get container status \"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b\": rpc error: code = NotFound desc = could not find container \"7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b\": container with ID starting with 7024d40acde6d7403e1b9346b7452a5efe82c4e77f9a2326ca2610bd9033742b not found: ID does not exist"
Apr 17 21:17:34.797857 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.797813 2567 scope.go:117] "RemoveContainer" containerID="c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20"
Apr 17 21:17:34.798027 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.798008 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20"} err="failed to get container status \"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20\": rpc error: code = NotFound desc = could not find container \"c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20\": container with ID starting with c703626f5ceb0bb7716f4b5846aaa16720367395310d5351d9abfdcd76df4f20 not found: ID does not exist"
Apr 17 21:17:34.798771 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.798757 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.801236 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801218 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 21:17:34.801339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801270 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 21:17:34.801498 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801478 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m7n8s\""
Apr 17 21:17:34.801498 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801492 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 21:17:34.801663 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801606 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 21:17:34.801857 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801841 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 21:17:34.801924 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801842 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 21:17:34.801924 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801847 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 21:17:34.802041 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.801935 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 21:17:34.804593 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.804571 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 21:17:34.807219 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.807136 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 21:17:34.859301 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859265 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859487 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859307 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859487 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-config-out\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859487 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859408 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-web-config\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859487 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859446 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859696 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859512 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859696 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859696 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859618 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkrl\" (UniqueName: \"kubernetes.io/projected/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-kube-api-access-9kkrl\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859696 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859646 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859696 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859659 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859696 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859688 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859714 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.859902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.859730 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.960838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.960796 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kkrl\" (UniqueName: \"kubernetes.io/projected/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-kube-api-access-9kkrl\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.960838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.960845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.960863 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.960886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.960911 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.960928 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961092 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961334 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961132 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961334 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961163 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-config-out\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961334 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961321 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.961947 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961921 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-web-config\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.962068 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961951 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.962068 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.961976 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.962068 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.962040 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.962222 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.962076 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.963935 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.963813 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-config-out\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964171 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964143 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-config-volume\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964171 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964164 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964296 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964165 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964492 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-web-config\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964613 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964678 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964654 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964677 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.964736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.964726 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.965952 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.965928 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:34.968708 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:34.968687 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kkrl\" (UniqueName: \"kubernetes.io/projected/fc92d74d-3f8d-47c4-9c67-8d411bad18a3-kube-api-access-9kkrl\") pod \"alertmanager-main-0\" (UID: \"fc92d74d-3f8d-47c4-9c67-8d411bad18a3\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 21:17:35.055034 ip-10-0-134-198
kubenswrapper[2567]: I0417 21:17:35.054997 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65b4b530-d97b-4a1f-af94-60a53cc3202d" path="/var/lib/kubelet/pods/65b4b530-d97b-4a1f-af94-60a53cc3202d/volumes" Apr 17 21:17:35.110221 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:35.110184 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 21:17:35.238678 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:35.238645 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 21:17:35.241286 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:17:35.241257 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc92d74d_3f8d_47c4_9c67_8d411bad18a3.slice/crio-57a0d58b615e312486da7e1f0d125eda7920b47fac62b3a55647b678c517ff4d WatchSource:0}: Error finding container 57a0d58b615e312486da7e1f0d125eda7920b47fac62b3a55647b678c517ff4d: Status 404 returned error can't find the container with id 57a0d58b615e312486da7e1f0d125eda7920b47fac62b3a55647b678c517ff4d Apr 17 21:17:35.744364 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:35.744324 2567 generic.go:358] "Generic (PLEG): container finished" podID="fc92d74d-3f8d-47c4-9c67-8d411bad18a3" containerID="7e58d453ce0b4dd3dd05cdc76e9cf71a6e15fbd169f37c22eb0d38f8db1877c8" exitCode=0 Apr 17 21:17:35.744728 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:35.744411 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerDied","Data":"7e58d453ce0b4dd3dd05cdc76e9cf71a6e15fbd169f37c22eb0d38f8db1877c8"} Apr 17 21:17:35.744728 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:35.744445 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"57a0d58b615e312486da7e1f0d125eda7920b47fac62b3a55647b678c517ff4d"} Apr 17 21:17:36.751496 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.751459 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"276790ba7a51172c1f753cf5ecea2513d8dfe83f076c31fc390fd5d6fe853891"} Apr 17 21:17:36.751496 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.751498 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"66a2371a3ac26fb8ce58cf7341151859be36f079039a69ca571dc35fe1170c8c"} Apr 17 21:17:36.752040 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.751511 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"3ecdf672dad86413dcb8975cba2d9c96fa471ea5e005b42b9174300aa03f8233"} Apr 17 21:17:36.752040 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.751538 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"296355af39bd2ff1ec7dd67cbc7b0ce51a42cca8fd8c92c5f5afedcfe4efcf6f"} Apr 17 21:17:36.752040 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.751553 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"063ad6f8a15560cfb9fb7514d69436a132b5afc5f7c9f9e0894a0c17423477f2"} Apr 17 21:17:36.752040 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.751563 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fc92d74d-3f8d-47c4-9c67-8d411bad18a3","Type":"ContainerStarted","Data":"de22976861016a75996909c077f776787936eaf1f8a25ce10d81e65c50cf3595"} Apr 17 21:17:36.778392 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:36.778329 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.778310548 podStartE2EDuration="2.778310548s" podCreationTimestamp="2026-04-17 21:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:17:36.776094808 +0000 UTC m=+212.345607572" watchObservedRunningTime="2026-04-17 21:17:36.778310548 +0000 UTC m=+212.347823301" Apr 17 21:17:37.178392 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.178354 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-85cd5dfb97-ddpld"] Apr 17 21:17:37.182405 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.182376 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.184752 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.184723 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 21:17:37.184874 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.184771 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 21:17:37.184874 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.184724 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-lwt27\"" Apr 17 21:17:37.184874 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.184861 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 21:17:37.185055 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.184999 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 21:17:37.185176 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.185159 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 21:17:37.189272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.189250 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 21:17:37.193896 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.193876 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85cd5dfb97-ddpld"] Apr 17 21:17:37.284073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284028 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-bvd4f\" (UniqueName: \"kubernetes.io/projected/075cee51-892e-4c92-90cb-bc43f3a6c219-kube-api-access-bvd4f\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284086 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-serving-certs-ca-bundle\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284146 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-secret-telemeter-client\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284189 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-federate-client-tls\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284225 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284241 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-metrics-client-ca\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284412 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284259 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-telemeter-client-tls\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.284412 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.284312 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385655 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385794 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385722 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-metrics-client-ca\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385794 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385759 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-telemeter-client-tls\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385799 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385837 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvd4f\" (UniqueName: \"kubernetes.io/projected/075cee51-892e-4c92-90cb-bc43f3a6c219-kube-api-access-bvd4f\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385928 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:17:37.385892 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-serving-certs-ca-bundle\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.385966 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385929 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-secret-telemeter-client\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.386010 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.385968 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-federate-client-tls\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.386591 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.386562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-serving-certs-ca-bundle\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.386720 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.386562 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-metrics-client-ca\") pod 
\"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.386873 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.386846 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/075cee51-892e-4c92-90cb-bc43f3a6c219-telemeter-trusted-ca-bundle\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.388502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.388477 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.388629 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.388578 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-federate-client-tls\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.388713 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.388693 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-secret-telemeter-client\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.388764 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:17:37.388746 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/075cee51-892e-4c92-90cb-bc43f3a6c219-telemeter-client-tls\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.394983 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.394947 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvd4f\" (UniqueName: \"kubernetes.io/projected/075cee51-892e-4c92-90cb-bc43f3a6c219-kube-api-access-bvd4f\") pod \"telemeter-client-85cd5dfb97-ddpld\" (UID: \"075cee51-892e-4c92-90cb-bc43f3a6c219\") " pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.493428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.493339 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" Apr 17 21:17:37.615560 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.615343 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-85cd5dfb97-ddpld"] Apr 17 21:17:37.617996 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:17:37.617968 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075cee51_892e_4c92_90cb_bc43f3a6c219.slice/crio-87460ab9c510ca36bb222646db2e95be3520127244c62162d89c7f3524bd62f4 WatchSource:0}: Error finding container 87460ab9c510ca36bb222646db2e95be3520127244c62162d89c7f3524bd62f4: Status 404 returned error can't find the container with id 87460ab9c510ca36bb222646db2e95be3520127244c62162d89c7f3524bd62f4 Apr 17 21:17:37.756764 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:37.756681 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" 
event={"ID":"075cee51-892e-4c92-90cb-bc43f3a6c219","Type":"ContainerStarted","Data":"87460ab9c510ca36bb222646db2e95be3520127244c62162d89c7f3524bd62f4"} Apr 17 21:17:39.774721 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:39.774690 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" event={"ID":"075cee51-892e-4c92-90cb-bc43f3a6c219","Type":"ContainerStarted","Data":"aff8b52f69b64d397b0d2cc14a251ea9333d72cf043c31fc3d810908f793bb23"} Apr 17 21:17:39.774721 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:39.774725 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" event={"ID":"075cee51-892e-4c92-90cb-bc43f3a6c219","Type":"ContainerStarted","Data":"2a624a57564a3123fd1f4111ec8e7d56b94ea831085ac440176d22f808e40d5e"} Apr 17 21:17:40.779170 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:40.779132 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" event={"ID":"075cee51-892e-4c92-90cb-bc43f3a6c219","Type":"ContainerStarted","Data":"0d31e07d7b627d2989cde135a02681298db03ddc2aeee09a69759d5d7c6c2008"} Apr 17 21:17:40.800919 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:40.800872 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-85cd5dfb97-ddpld" podStartSLOduration=1.7933057319999999 podStartE2EDuration="3.800855938s" podCreationTimestamp="2026-04-17 21:17:37 +0000 UTC" firstStartedPulling="2026-04-17 21:17:37.619864482 +0000 UTC m=+213.189377214" lastFinishedPulling="2026-04-17 21:17:39.627414686 +0000 UTC m=+215.196927420" observedRunningTime="2026-04-17 21:17:40.798370404 +0000 UTC m=+216.367883179" watchObservedRunningTime="2026-04-17 21:17:40.800855938 +0000 UTC m=+216.370368691" Apr 17 21:17:41.358218 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.358170 2567 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-8b6d7b5f4-4zq48"] Apr 17 21:17:41.361965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.361940 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.372039 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.371870 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8b6d7b5f4-4zq48"] Apr 17 21:17:41.424413 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424380 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-config\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.424413 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424416 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-service-ca\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.424646 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424500 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-oauth-serving-cert\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.424646 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424537 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srv99\" (UniqueName: 
\"kubernetes.io/projected/474f3e92-6282-474e-8a1d-6ea06db92e0e-kube-api-access-srv99\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.424646 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424559 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-serving-cert\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.424749 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424653 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-oauth-config\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.424749 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.424685 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-trusted-ca-bundle\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.525484 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.525442 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-config\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.525484 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:17:41.525485 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-service-ca\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.525709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.525547 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-oauth-serving-cert\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.525709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.525564 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srv99\" (UniqueName: \"kubernetes.io/projected/474f3e92-6282-474e-8a1d-6ea06db92e0e-kube-api-access-srv99\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.525709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.525583 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-serving-cert\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.525709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.525616 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-oauth-config\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 
17 21:17:41.525709 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.525633 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-trusted-ca-bundle\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.526294 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.526257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-service-ca\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.526407 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.526346 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-oauth-serving-cert\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.526485 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.526467 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-trusted-ca-bundle\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.527922 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.527900 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-config\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " 
pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.528007 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.527980 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-oauth-config\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.528223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.528202 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-serving-cert\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.532555 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.532510 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srv99\" (UniqueName: \"kubernetes.io/projected/474f3e92-6282-474e-8a1d-6ea06db92e0e-kube-api-access-srv99\") pod \"console-8b6d7b5f4-4zq48\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") " pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.673329 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.673237 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:41.794888 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:41.794864 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8b6d7b5f4-4zq48"] Apr 17 21:17:41.797321 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:17:41.797286 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474f3e92_6282_474e_8a1d_6ea06db92e0e.slice/crio-36f9b18208274f8422d8685ffcec2687c491a0ee50b30ee9fee40b6ad96e31bb WatchSource:0}: Error finding container 36f9b18208274f8422d8685ffcec2687c491a0ee50b30ee9fee40b6ad96e31bb: Status 404 returned error can't find the container with id 36f9b18208274f8422d8685ffcec2687c491a0ee50b30ee9fee40b6ad96e31bb Apr 17 21:17:42.787112 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:42.787069 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b6d7b5f4-4zq48" event={"ID":"474f3e92-6282-474e-8a1d-6ea06db92e0e","Type":"ContainerStarted","Data":"8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018"} Apr 17 21:17:42.787112 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:42.787114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b6d7b5f4-4zq48" event={"ID":"474f3e92-6282-474e-8a1d-6ea06db92e0e","Type":"ContainerStarted","Data":"36f9b18208274f8422d8685ffcec2687c491a0ee50b30ee9fee40b6ad96e31bb"} Apr 17 21:17:42.803968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:42.803919 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8b6d7b5f4-4zq48" podStartSLOduration=1.803905447 podStartE2EDuration="1.803905447s" podCreationTimestamp="2026-04-17 21:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:17:42.802628956 +0000 UTC m=+218.372141733" 
watchObservedRunningTime="2026-04-17 21:17:42.803905447 +0000 UTC m=+218.373418200" Apr 17 21:17:51.674397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:51.674359 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:51.674397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:51.674406 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:51.679032 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:51.679007 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:51.819173 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:51.819144 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8b6d7b5f4-4zq48" Apr 17 21:17:51.865030 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:17:51.864999 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bfcd69665-nz7jn"] Apr 17 21:18:16.891726 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:16.891668 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bfcd69665-nz7jn" podUID="215e7a1f-e930-47c1-80f1-4820191852ae" containerName="console" containerID="cri-o://bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b" gracePeriod=15 Apr 17 21:18:17.129493 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.129471 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bfcd69665-nz7jn_215e7a1f-e930-47c1-80f1-4820191852ae/console/0.log" Apr 17 21:18:17.129636 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.129555 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:18:17.226447 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226355 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-serving-cert\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.226447 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226414 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-trusted-ca-bundle\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.226447 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226441 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-oauth-serving-cert\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.226754 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226468 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-console-config\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.226754 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226505 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp8qv\" (UniqueName: \"kubernetes.io/projected/215e7a1f-e930-47c1-80f1-4820191852ae-kube-api-access-qp8qv\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.226754 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226614 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-oauth-config\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.226754 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226645 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-service-ca\") pod \"215e7a1f-e930-47c1-80f1-4820191852ae\" (UID: \"215e7a1f-e930-47c1-80f1-4820191852ae\") " Apr 17 21:18:17.227069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226956 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:18:17.227069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226963 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-console-config" (OuterVolumeSpecName: "console-config") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:18:17.227069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.226971 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:18:17.227259 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.227190 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-service-ca" (OuterVolumeSpecName: "service-ca") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:18:17.228723 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.228694 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:18:17.228809 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.228717 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:18:17.228809 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.228739 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/215e7a1f-e930-47c1-80f1-4820191852ae-kube-api-access-qp8qv" (OuterVolumeSpecName: "kube-api-access-qp8qv") pod "215e7a1f-e930-47c1-80f1-4820191852ae" (UID: "215e7a1f-e930-47c1-80f1-4820191852ae"). InnerVolumeSpecName "kube-api-access-qp8qv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:18:17.327615 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327579 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-oauth-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.327615 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327609 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-service-ca\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.327615 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327619 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/215e7a1f-e930-47c1-80f1-4820191852ae-console-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.327615 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327628 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-trusted-ca-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.327872 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327638 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-oauth-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.327872 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327647 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/215e7a1f-e930-47c1-80f1-4820191852ae-console-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.327872 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.327657 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp8qv\" (UniqueName: \"kubernetes.io/projected/215e7a1f-e930-47c1-80f1-4820191852ae-kube-api-access-qp8qv\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:18:17.892355 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.892328 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bfcd69665-nz7jn_215e7a1f-e930-47c1-80f1-4820191852ae/console/0.log" Apr 17 21:18:17.892788 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.892368 2567 generic.go:358] "Generic (PLEG): container finished" podID="215e7a1f-e930-47c1-80f1-4820191852ae" containerID="bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b" exitCode=2 Apr 17 21:18:17.892788 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.892400 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfcd69665-nz7jn" event={"ID":"215e7a1f-e930-47c1-80f1-4820191852ae","Type":"ContainerDied","Data":"bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b"} Apr 17 21:18:17.892788 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.892432 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfcd69665-nz7jn" Apr 17 21:18:17.892788 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.892439 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfcd69665-nz7jn" event={"ID":"215e7a1f-e930-47c1-80f1-4820191852ae","Type":"ContainerDied","Data":"618ee2e431436e17be5909aa883d7d93cfce0de837c8a92d2a76d2eeea7325ef"} Apr 17 21:18:17.892788 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.892455 2567 scope.go:117] "RemoveContainer" containerID="bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b" Apr 17 21:18:17.905291 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.905272 2567 scope.go:117] "RemoveContainer" containerID="bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b" Apr 17 21:18:17.905609 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:18:17.905584 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b\": container with ID starting with bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b not found: ID does not exist" containerID="bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b" Apr 17 21:18:17.905704 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.905615 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b"} err="failed to get container status \"bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b\": rpc error: code = NotFound desc = could not find container \"bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b\": container with ID starting with bb8af16bc4dace89f588bb2f6113162f3e41bc24b3af3b542a1206a0624f316b not found: ID does not exist" Apr 17 21:18:17.916018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.915991 2567 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bfcd69665-nz7jn"] Apr 17 21:18:17.917333 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:17.917311 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bfcd69665-nz7jn"] Apr 17 21:18:19.054133 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:18:19.054095 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="215e7a1f-e930-47c1-80f1-4820191852ae" path="/var/lib/kubelet/pods/215e7a1f-e930-47c1-80f1-4820191852ae/volumes" Apr 17 21:19:04.934974 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:04.934944 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:19:04.937193 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:04.937165 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:19:04.938598 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:04.938572 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:19:04.940770 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:04.940748 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:19:04.944346 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:04.944324 2567 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 21:19:28.644993 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.644959 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c6df6579f-w4rxx"] Apr 17 21:19:28.646912 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.645310 2567 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="215e7a1f-e930-47c1-80f1-4820191852ae" containerName="console" Apr 17 21:19:28.646912 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.645322 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="215e7a1f-e930-47c1-80f1-4820191852ae" containerName="console" Apr 17 21:19:28.646912 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.645384 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="215e7a1f-e930-47c1-80f1-4820191852ae" containerName="console" Apr 17 21:19:28.647588 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.647563 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.656834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.656807 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6df6579f-w4rxx"] Apr 17 21:19:28.730589 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730557 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-oauth-serving-cert\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.730589 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730594 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktln\" (UniqueName: \"kubernetes.io/projected/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-kube-api-access-gktln\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.730819 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730615 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-serving-cert\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.730819 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730685 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-trusted-ca-bundle\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.730819 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730747 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-config\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.730819 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730763 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-service-ca\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.731011 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.730831 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-oauth-config\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831217 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831177 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-oauth-serving-cert\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831217 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831217 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gktln\" (UniqueName: \"kubernetes.io/projected/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-kube-api-access-gktln\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831474 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831235 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-serving-cert\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831474 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831266 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-trusted-ca-bundle\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831474 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831436 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-config\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " 
pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831673 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831478 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-service-ca\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.831673 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831603 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-oauth-config\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.832012 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.831986 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-oauth-serving-cert\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.832125 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.832107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-trusted-ca-bundle\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.832179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.832158 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-config\") pod \"console-c6df6579f-w4rxx\" (UID: 
\"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.832264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.832242 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-service-ca\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.833918 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.833888 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-oauth-config\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.834022 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.833962 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-serving-cert\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.839180 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.839163 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktln\" (UniqueName: \"kubernetes.io/projected/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-kube-api-access-gktln\") pod \"console-c6df6579f-w4rxx\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:28.957814 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:28.957725 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:19:29.078225 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:29.078156 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c6df6579f-w4rxx"] Apr 17 21:19:29.083156 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:19:29.083094 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f84fcf_5e32_4376_9ad9_4f5391a53cbf.slice/crio-8d684c36afeed793ba5d581641b030554d006ea1b8951435ccee2e94712d02b9 WatchSource:0}: Error finding container 8d684c36afeed793ba5d581641b030554d006ea1b8951435ccee2e94712d02b9: Status 404 returned error can't find the container with id 8d684c36afeed793ba5d581641b030554d006ea1b8951435ccee2e94712d02b9 Apr 17 21:19:29.084743 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:29.084727 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:19:29.095366 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:29.095333 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6df6579f-w4rxx" event={"ID":"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf","Type":"ContainerStarted","Data":"8d684c36afeed793ba5d581641b030554d006ea1b8951435ccee2e94712d02b9"} Apr 17 21:19:30.098922 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:30.098886 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6df6579f-w4rxx" event={"ID":"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf","Type":"ContainerStarted","Data":"09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995"} Apr 17 21:19:30.115557 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:30.115496 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c6df6579f-w4rxx" podStartSLOduration=2.115480397 podStartE2EDuration="2.115480397s" podCreationTimestamp="2026-04-17 21:19:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:19:30.114032282 +0000 UTC m=+325.683545035" watchObservedRunningTime="2026-04-17 21:19:30.115480397 +0000 UTC m=+325.684993149"
Apr 17 21:19:38.958645 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:38.958610 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c6df6579f-w4rxx"
Apr 17 21:19:38.959208 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:38.958656 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c6df6579f-w4rxx"
Apr 17 21:19:38.963311 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:38.963285 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c6df6579f-w4rxx"
Apr 17 21:19:39.129965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.129940 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c6df6579f-w4rxx"
Apr 17 21:19:39.167001 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.166970 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dsmfz"]
Apr 17 21:19:39.169594 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.169573 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.172198 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.172177 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 21:19:39.179428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.179402 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dsmfz"]
Apr 17 21:19:39.184148 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.184128 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8b6d7b5f4-4zq48"]
Apr 17 21:19:39.326381 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.326295 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b51bcd52-26f7-423d-a48a-7a9ea687c5be-original-pull-secret\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.326381 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.326335 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b51bcd52-26f7-423d-a48a-7a9ea687c5be-kubelet-config\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.326381 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.326371 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b51bcd52-26f7-423d-a48a-7a9ea687c5be-dbus\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.427123 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.427082 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b51bcd52-26f7-423d-a48a-7a9ea687c5be-original-pull-secret\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.427123 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.427127 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b51bcd52-26f7-423d-a48a-7a9ea687c5be-kubelet-config\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.427319 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.427155 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b51bcd52-26f7-423d-a48a-7a9ea687c5be-dbus\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.427319 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.427222 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b51bcd52-26f7-423d-a48a-7a9ea687c5be-kubelet-config\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.427385 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.427316 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b51bcd52-26f7-423d-a48a-7a9ea687c5be-dbus\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.429349 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.429318 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b51bcd52-26f7-423d-a48a-7a9ea687c5be-original-pull-secret\") pod \"global-pull-secret-syncer-dsmfz\" (UID: \"b51bcd52-26f7-423d-a48a-7a9ea687c5be\") " pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.480499 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.480463 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dsmfz"
Apr 17 21:19:39.598631 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:39.598496 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dsmfz"]
Apr 17 21:19:39.601167 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:19:39.601142 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb51bcd52_26f7_423d_a48a_7a9ea687c5be.slice/crio-ea694e10edfe28fc02d207469f9f5c24e25e8cbf0976514ac047b32959ef8b65 WatchSource:0}: Error finding container ea694e10edfe28fc02d207469f9f5c24e25e8cbf0976514ac047b32959ef8b65: Status 404 returned error can't find the container with id ea694e10edfe28fc02d207469f9f5c24e25e8cbf0976514ac047b32959ef8b65
Apr 17 21:19:40.130243 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:40.130207 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dsmfz" event={"ID":"b51bcd52-26f7-423d-a48a-7a9ea687c5be","Type":"ContainerStarted","Data":"ea694e10edfe28fc02d207469f9f5c24e25e8cbf0976514ac047b32959ef8b65"}
Apr 17 21:19:45.148506 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:45.148468 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dsmfz" event={"ID":"b51bcd52-26f7-423d-a48a-7a9ea687c5be","Type":"ContainerStarted","Data":"b71cb8ccd80bc1493119a462376b1af03c006f8c1d1db2d443bb29f4f6ee5607"}
Apr 17 21:19:45.163216 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:45.163162 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dsmfz" podStartSLOduration=1.689972124 podStartE2EDuration="6.163145203s" podCreationTimestamp="2026-04-17 21:19:39 +0000 UTC" firstStartedPulling="2026-04-17 21:19:39.602820407 +0000 UTC m=+335.172333137" lastFinishedPulling="2026-04-17 21:19:44.075993486 +0000 UTC m=+339.645506216" observedRunningTime="2026-04-17 21:19:45.161604657 +0000 UTC m=+340.731117422" watchObservedRunningTime="2026-04-17 21:19:45.163145203 +0000 UTC m=+340.732657986"
Apr 17 21:19:51.557298 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.557254 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"]
Apr 17 21:19:51.562274 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.562243 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.564770 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.564709 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 21:19:51.564909 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.564783 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 21:19:51.566061 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.566043 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t66c9\""
Apr 17 21:19:51.567838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.567818 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"]
Apr 17 21:19:51.628197 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.628164 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.628401 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.628250 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.628401 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.628280 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skddw\" (UniqueName: \"kubernetes.io/projected/4638f788-25bd-4afc-998d-9b44dd2b28dd-kube-api-access-skddw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.728808 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.728772 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.729021 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.728831 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.729021 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.728852 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skddw\" (UniqueName: \"kubernetes.io/projected/4638f788-25bd-4afc-998d-9b44dd2b28dd-kube-api-access-skddw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.729192 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.729171 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.729229 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.729196 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.736903 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.736872 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skddw\" (UniqueName: \"kubernetes.io/projected/4638f788-25bd-4afc-998d-9b44dd2b28dd-kube-api-access-skddw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.872137 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.872110 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:19:51.992854 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:51.992687 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"]
Apr 17 21:19:51.995453 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:19:51.995424 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4638f788_25bd_4afc_998d_9b44dd2b28dd.slice/crio-35f814f8e6c612c961f17bf6f52f3055d6868608467dc6f45e9aa71b88bdded7 WatchSource:0}: Error finding container 35f814f8e6c612c961f17bf6f52f3055d6868608467dc6f45e9aa71b88bdded7: Status 404 returned error can't find the container with id 35f814f8e6c612c961f17bf6f52f3055d6868608467dc6f45e9aa71b88bdded7
Apr 17 21:19:52.170116 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:52.170025 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8" event={"ID":"4638f788-25bd-4afc-998d-9b44dd2b28dd","Type":"ContainerStarted","Data":"35f814f8e6c612c961f17bf6f52f3055d6868608467dc6f45e9aa71b88bdded7"}
Apr 17 21:19:57.195970 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:57.195926 2567 generic.go:358] "Generic (PLEG): container finished" podID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerID="31a4d3c7190f512dc9c95f53bee2ff495366a8541d2efd3accf83139c4e3a3c4" exitCode=0
Apr 17 21:19:57.196356 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:19:57.196014 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8" event={"ID":"4638f788-25bd-4afc-998d-9b44dd2b28dd","Type":"ContainerDied","Data":"31a4d3c7190f512dc9c95f53bee2ff495366a8541d2efd3accf83139c4e3a3c4"}
Apr 17 21:20:04.204295 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.204247 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-8b6d7b5f4-4zq48" podUID="474f3e92-6282-474e-8a1d-6ea06db92e0e" containerName="console" containerID="cri-o://8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018" gracePeriod=15
Apr 17 21:20:04.217805 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.217779 2567 generic.go:358] "Generic (PLEG): container finished" podID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerID="0186a6098bb6b3c264fc10fed2c3c58f21410cf581b86f3c803a05a3825f2aa0" exitCode=0
Apr 17 21:20:04.217899 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.217854 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8" event={"ID":"4638f788-25bd-4afc-998d-9b44dd2b28dd","Type":"ContainerDied","Data":"0186a6098bb6b3c264fc10fed2c3c58f21410cf581b86f3c803a05a3825f2aa0"}
Apr 17 21:20:04.443437 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.443414 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8b6d7b5f4-4zq48_474f3e92-6282-474e-8a1d-6ea06db92e0e/console/0.log"
Apr 17 21:20:04.443590 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.443477 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8b6d7b5f4-4zq48"
Apr 17 21:20:04.545070 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.544975 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-oauth-serving-cert\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545070 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545025 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-serving-cert\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545070 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545067 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srv99\" (UniqueName: \"kubernetes.io/projected/474f3e92-6282-474e-8a1d-6ea06db92e0e-kube-api-access-srv99\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545101 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-oauth-config\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545145 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-trusted-ca-bundle\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545180 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-service-ca\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545210 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-config\") pod \"474f3e92-6282-474e-8a1d-6ea06db92e0e\" (UID: \"474f3e92-6282-474e-8a1d-6ea06db92e0e\") "
Apr 17 21:20:04.545683 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545417 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:20:04.545746 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545688 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-service-ca" (OuterVolumeSpecName: "service-ca") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:20:04.545881 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.545843 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:20:04.546066 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.546044 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-config" (OuterVolumeSpecName: "console-config") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:20:04.546138 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.546106 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-oauth-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:04.546138 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.546122 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-trusted-ca-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:04.546138 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.546132 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-service-ca\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:04.546230 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.546140 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:04.547415 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.547395 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:20:04.547828 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.547792 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:20:04.547911 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.547849 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474f3e92-6282-474e-8a1d-6ea06db92e0e-kube-api-access-srv99" (OuterVolumeSpecName: "kube-api-access-srv99") pod "474f3e92-6282-474e-8a1d-6ea06db92e0e" (UID: "474f3e92-6282-474e-8a1d-6ea06db92e0e"). InnerVolumeSpecName "kube-api-access-srv99". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:20:04.647175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.647134 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-oauth-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:04.647175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.647166 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/474f3e92-6282-474e-8a1d-6ea06db92e0e-console-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:04.647175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:04.647180 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srv99\" (UniqueName: \"kubernetes.io/projected/474f3e92-6282-474e-8a1d-6ea06db92e0e-kube-api-access-srv99\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:05.221671 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.221636 2567 generic.go:358] "Generic (PLEG): container finished" podID="474f3e92-6282-474e-8a1d-6ea06db92e0e" containerID="8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018" exitCode=2
Apr 17 21:20:05.222049 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.221721 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8b6d7b5f4-4zq48"
Apr 17 21:20:05.222049 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.221720 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b6d7b5f4-4zq48" event={"ID":"474f3e92-6282-474e-8a1d-6ea06db92e0e","Type":"ContainerDied","Data":"8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018"}
Apr 17 21:20:05.222049 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.221763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b6d7b5f4-4zq48" event={"ID":"474f3e92-6282-474e-8a1d-6ea06db92e0e","Type":"ContainerDied","Data":"36f9b18208274f8422d8685ffcec2687c491a0ee50b30ee9fee40b6ad96e31bb"}
Apr 17 21:20:05.222049 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.221786 2567 scope.go:117] "RemoveContainer" containerID="8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018"
Apr 17 21:20:05.230653 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.230637 2567 scope.go:117] "RemoveContainer" containerID="8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018"
Apr 17 21:20:05.230917 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:20:05.230899 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018\": container with ID starting with 8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018 not found: ID does not exist" containerID="8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018"
Apr 17 21:20:05.230961 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.230926 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018"} err="failed to get container status \"8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018\": rpc error: code = NotFound desc = could not find container \"8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018\": container with ID starting with 8c3403dcd0cf1813c8855b4a18845fa939f7cb15fdedbe8564d98824aca8f018 not found: ID does not exist"
Apr 17 21:20:05.253025 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.252990 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8b6d7b5f4-4zq48"]
Apr 17 21:20:05.255579 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:05.255548 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8b6d7b5f4-4zq48"]
Apr 17 21:20:07.055189 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:07.055156 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474f3e92-6282-474e-8a1d-6ea06db92e0e" path="/var/lib/kubelet/pods/474f3e92-6282-474e-8a1d-6ea06db92e0e/volumes"
Apr 17 21:20:12.246568 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:12.246531 2567 generic.go:358] "Generic (PLEG): container finished" podID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerID="82768625c8111fd53f2a3931dcaed6a49588d0cd6d862203ce16cbaf79d33eee" exitCode=0
Apr 17 21:20:12.246985 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:12.246617 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8" event={"ID":"4638f788-25bd-4afc-998d-9b44dd2b28dd","Type":"ContainerDied","Data":"82768625c8111fd53f2a3931dcaed6a49588d0cd6d862203ce16cbaf79d33eee"}
Apr 17 21:20:13.366004 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.365983 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:20:13.425103 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.425065 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skddw\" (UniqueName: \"kubernetes.io/projected/4638f788-25bd-4afc-998d-9b44dd2b28dd-kube-api-access-skddw\") pod \"4638f788-25bd-4afc-998d-9b44dd2b28dd\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") "
Apr 17 21:20:13.425290 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.425159 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-bundle\") pod \"4638f788-25bd-4afc-998d-9b44dd2b28dd\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") "
Apr 17 21:20:13.425290 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.425210 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-util\") pod \"4638f788-25bd-4afc-998d-9b44dd2b28dd\" (UID: \"4638f788-25bd-4afc-998d-9b44dd2b28dd\") "
Apr 17 21:20:13.425783 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.425758 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-bundle" (OuterVolumeSpecName: "bundle") pod "4638f788-25bd-4afc-998d-9b44dd2b28dd" (UID: "4638f788-25bd-4afc-998d-9b44dd2b28dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:20:13.427325 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.427305 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4638f788-25bd-4afc-998d-9b44dd2b28dd-kube-api-access-skddw" (OuterVolumeSpecName: "kube-api-access-skddw") pod "4638f788-25bd-4afc-998d-9b44dd2b28dd" (UID: "4638f788-25bd-4afc-998d-9b44dd2b28dd"). InnerVolumeSpecName "kube-api-access-skddw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:20:13.429263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.429239 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-util" (OuterVolumeSpecName: "util") pod "4638f788-25bd-4afc-998d-9b44dd2b28dd" (UID: "4638f788-25bd-4afc-998d-9b44dd2b28dd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:20:13.526566 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.526459 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skddw\" (UniqueName: \"kubernetes.io/projected/4638f788-25bd-4afc-998d-9b44dd2b28dd-kube-api-access-skddw\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:13.526566 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.526494 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:13.526566 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:13.526506 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4638f788-25bd-4afc-998d-9b44dd2b28dd-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:14.253761 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:14.253723 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8" event={"ID":"4638f788-25bd-4afc-998d-9b44dd2b28dd","Type":"ContainerDied","Data":"35f814f8e6c612c961f17bf6f52f3055d6868608467dc6f45e9aa71b88bdded7"}
Apr 17 21:20:14.253761 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:14.253762 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35f814f8e6c612c961f17bf6f52f3055d6868608467dc6f45e9aa71b88bdded7"
Apr 17 21:20:14.253974 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:14.253733 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vbzl8"
Apr 17 21:20:19.035716 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.035680 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj"]
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.035980 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="extract"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.035990 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="extract"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036003 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="474f3e92-6282-474e-8a1d-6ea06db92e0e" containerName="console"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036008 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="474f3e92-6282-474e-8a1d-6ea06db92e0e" containerName="console"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036016 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="util"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036021 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="util"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036027 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="pull"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036032 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="pull"
Apr 17 21:20:19.036082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036083 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="474f3e92-6282-474e-8a1d-6ea06db92e0e" containerName="console"
Apr 17 21:20:19.036334 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.036093 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4638f788-25bd-4afc-998d-9b44dd2b28dd" containerName="extract"
Apr 17 21:20:19.040397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.040377 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj"
Apr 17 21:20:19.042894 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.042872 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-nq9tr\""
Apr 17 21:20:19.042894 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.042891 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 21:20:19.043048 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.042964 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:20:19.049972 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.049951 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj"]
Apr 17 21:20:19.173149 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.173113 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrqp\" (UniqueName: \"kubernetes.io/projected/91bc8c3f-272d-4d2f-8c71-c00a1cde0211-kube-api-access-ttrqp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nq6fj\" (UID: \"91bc8c3f-272d-4d2f-8c71-c00a1cde0211\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj"
Apr 17 21:20:19.173328 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.173162 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91bc8c3f-272d-4d2f-8c71-c00a1cde0211-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nq6fj\" (UID: \"91bc8c3f-272d-4d2f-8c71-c00a1cde0211\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj"
Apr 17 21:20:19.273762 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.273731 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrqp\" (UniqueName: \"kubernetes.io/projected/91bc8c3f-272d-4d2f-8c71-c00a1cde0211-kube-api-access-ttrqp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nq6fj\" (UID: \"91bc8c3f-272d-4d2f-8c71-c00a1cde0211\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" Apr 17 21:20:19.273905 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.273769 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91bc8c3f-272d-4d2f-8c71-c00a1cde0211-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nq6fj\" (UID: \"91bc8c3f-272d-4d2f-8c71-c00a1cde0211\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" Apr 17 21:20:19.274109 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.274093 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/91bc8c3f-272d-4d2f-8c71-c00a1cde0211-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nq6fj\" (UID: \"91bc8c3f-272d-4d2f-8c71-c00a1cde0211\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" Apr 17 21:20:19.281686 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.281654 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrqp\" (UniqueName: \"kubernetes.io/projected/91bc8c3f-272d-4d2f-8c71-c00a1cde0211-kube-api-access-ttrqp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nq6fj\" (UID: \"91bc8c3f-272d-4d2f-8c71-c00a1cde0211\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" Apr 17 21:20:19.350803 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.350776 2567 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" Apr 17 21:20:19.479712 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:19.479683 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj"] Apr 17 21:20:19.481422 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:20:19.481399 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91bc8c3f_272d_4d2f_8c71_c00a1cde0211.slice/crio-4f4d944fb64d8565bf8fd7cf612cc3d1d027966d54e6ab39668c8175a90ce2db WatchSource:0}: Error finding container 4f4d944fb64d8565bf8fd7cf612cc3d1d027966d54e6ab39668c8175a90ce2db: Status 404 returned error can't find the container with id 4f4d944fb64d8565bf8fd7cf612cc3d1d027966d54e6ab39668c8175a90ce2db Apr 17 21:20:20.277255 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:20.277202 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" event={"ID":"91bc8c3f-272d-4d2f-8c71-c00a1cde0211","Type":"ContainerStarted","Data":"4f4d944fb64d8565bf8fd7cf612cc3d1d027966d54e6ab39668c8175a90ce2db"} Apr 17 21:20:23.289206 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:23.289126 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" event={"ID":"91bc8c3f-272d-4d2f-8c71-c00a1cde0211","Type":"ContainerStarted","Data":"1f595abbe32852dda92c5132669f23e62b92cb8d2b867d3e1c4921087dcaadcb"} Apr 17 21:20:23.310417 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:23.310362 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nq6fj" podStartSLOduration=0.816814494 podStartE2EDuration="4.310343539s" podCreationTimestamp="2026-04-17 
21:20:19 +0000 UTC" firstStartedPulling="2026-04-17 21:20:19.48377074 +0000 UTC m=+375.053283474" lastFinishedPulling="2026-04-17 21:20:22.977299788 +0000 UTC m=+378.546812519" observedRunningTime="2026-04-17 21:20:23.308135991 +0000 UTC m=+378.877648743" watchObservedRunningTime="2026-04-17 21:20:23.310343539 +0000 UTC m=+378.879856293" Apr 17 21:20:24.949890 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:24.949854 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd"] Apr 17 21:20:24.953179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:24.953158 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:24.955782 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:24.955760 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:20:24.956871 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:24.956854 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t66c9\"" Apr 17 21:20:24.956871 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:24.956870 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:20:24.961641 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:24.961615 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd"] Apr 17 21:20:25.125825 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.125783 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-bundle\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.126039 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.125838 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.126039 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.126011 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjff\" (UniqueName: \"kubernetes.io/projected/2161f627-afcb-42c6-8572-9fca96308b89-kube-api-access-tzjff\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.226617 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.226495 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.226806 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.226632 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjff\" (UniqueName: \"kubernetes.io/projected/2161f627-afcb-42c6-8572-9fca96308b89-kube-api-access-tzjff\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.226806 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.226656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.226926 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.226877 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.226926 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.226915 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.234237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.234212 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjff\" (UniqueName: \"kubernetes.io/projected/2161f627-afcb-42c6-8572-9fca96308b89-kube-api-access-tzjff\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd\" (UID: 
\"2161f627-afcb-42c6-8572-9fca96308b89\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.263212 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.263185 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:25.379947 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:25.379918 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd"] Apr 17 21:20:25.381614 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:20:25.381585 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2161f627_afcb_42c6_8572_9fca96308b89.slice/crio-047e9d2aff54a6ffdfcb607ac1c71b1fda53fda4206e1c7a04029aceb331ea1a WatchSource:0}: Error finding container 047e9d2aff54a6ffdfcb607ac1c71b1fda53fda4206e1c7a04029aceb331ea1a: Status 404 returned error can't find the container with id 047e9d2aff54a6ffdfcb607ac1c71b1fda53fda4206e1c7a04029aceb331ea1a Apr 17 21:20:26.300909 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:26.300873 2567 generic.go:358] "Generic (PLEG): container finished" podID="2161f627-afcb-42c6-8572-9fca96308b89" containerID="c956899a995c5177cdc7a7304ac13122b425f59058705afa27b4d8e8d332f25e" exitCode=0 Apr 17 21:20:26.300909 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:26.300912 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" event={"ID":"2161f627-afcb-42c6-8572-9fca96308b89","Type":"ContainerDied","Data":"c956899a995c5177cdc7a7304ac13122b425f59058705afa27b4d8e8d332f25e"} Apr 17 21:20:26.301376 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:26.300939 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" event={"ID":"2161f627-afcb-42c6-8572-9fca96308b89","Type":"ContainerStarted","Data":"047e9d2aff54a6ffdfcb607ac1c71b1fda53fda4206e1c7a04029aceb331ea1a"} Apr 17 21:20:27.033175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.033139 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-h59ln"] Apr 17 21:20:27.036617 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.036596 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.039028 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.039008 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 21:20:27.040139 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.040120 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-f78fb\"" Apr 17 21:20:27.040245 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.040142 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 21:20:27.044057 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.044030 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-h59ln"] Apr 17 21:20:27.143719 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.143682 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/441380f5-0e3d-45dc-a269-973efcd11862-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-h59ln\" (UID: \"441380f5-0e3d-45dc-a269-973efcd11862\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.143905 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:20:27.143788 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srf9\" (UniqueName: \"kubernetes.io/projected/441380f5-0e3d-45dc-a269-973efcd11862-kube-api-access-4srf9\") pod \"cert-manager-webhook-597b96b99b-h59ln\" (UID: \"441380f5-0e3d-45dc-a269-973efcd11862\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.244655 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.244616 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/441380f5-0e3d-45dc-a269-973efcd11862-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-h59ln\" (UID: \"441380f5-0e3d-45dc-a269-973efcd11862\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.244858 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.244709 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4srf9\" (UniqueName: \"kubernetes.io/projected/441380f5-0e3d-45dc-a269-973efcd11862-kube-api-access-4srf9\") pod \"cert-manager-webhook-597b96b99b-h59ln\" (UID: \"441380f5-0e3d-45dc-a269-973efcd11862\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.253162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.253126 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/441380f5-0e3d-45dc-a269-973efcd11862-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-h59ln\" (UID: \"441380f5-0e3d-45dc-a269-973efcd11862\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.253331 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.253309 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srf9\" (UniqueName: \"kubernetes.io/projected/441380f5-0e3d-45dc-a269-973efcd11862-kube-api-access-4srf9\") pod 
\"cert-manager-webhook-597b96b99b-h59ln\" (UID: \"441380f5-0e3d-45dc-a269-973efcd11862\") " pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.356155 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.356123 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:27.496658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:27.496631 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-h59ln"] Apr 17 21:20:27.499566 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:20:27.499530 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod441380f5_0e3d_45dc_a269_973efcd11862.slice/crio-980062f38077e96585ea776b8bd6242563399c97838b5869e9a6dd64b1fc2e26 WatchSource:0}: Error finding container 980062f38077e96585ea776b8bd6242563399c97838b5869e9a6dd64b1fc2e26: Status 404 returned error can't find the container with id 980062f38077e96585ea776b8bd6242563399c97838b5869e9a6dd64b1fc2e26 Apr 17 21:20:28.309568 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:28.309530 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" event={"ID":"441380f5-0e3d-45dc-a269-973efcd11862","Type":"ContainerStarted","Data":"980062f38077e96585ea776b8bd6242563399c97838b5869e9a6dd64b1fc2e26"} Apr 17 21:20:29.315361 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:29.315326 2567 generic.go:358] "Generic (PLEG): container finished" podID="2161f627-afcb-42c6-8572-9fca96308b89" containerID="94e4834c2176c21036ff0b630a804633c16d2a07fac4a66411ac7b281f15e28f" exitCode=0 Apr 17 21:20:29.315840 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:29.315384 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" 
event={"ID":"2161f627-afcb-42c6-8572-9fca96308b89","Type":"ContainerDied","Data":"94e4834c2176c21036ff0b630a804633c16d2a07fac4a66411ac7b281f15e28f"} Apr 17 21:20:30.324882 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:30.324846 2567 generic.go:358] "Generic (PLEG): container finished" podID="2161f627-afcb-42c6-8572-9fca96308b89" containerID="1ccb1f3960756ec04ae80f51a5299116b5a98e53e1ca15c3f93dd75e2859bcdb" exitCode=0 Apr 17 21:20:30.325243 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:30.324898 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" event={"ID":"2161f627-afcb-42c6-8572-9fca96308b89","Type":"ContainerDied","Data":"1ccb1f3960756ec04ae80f51a5299116b5a98e53e1ca15c3f93dd75e2859bcdb"} Apr 17 21:20:31.329939 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.329900 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" event={"ID":"441380f5-0e3d-45dc-a269-973efcd11862","Type":"ContainerStarted","Data":"a21f6aa6ceaec2447cf06b35788a27622977fc0d17806cd1b5742cce267eb92c"} Apr 17 21:20:31.330388 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.330006 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" Apr 17 21:20:31.345817 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.345762 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln" podStartSLOduration=1.242165296 podStartE2EDuration="4.345747285s" podCreationTimestamp="2026-04-17 21:20:27 +0000 UTC" firstStartedPulling="2026-04-17 21:20:27.501693746 +0000 UTC m=+383.071206478" lastFinishedPulling="2026-04-17 21:20:30.605275737 +0000 UTC m=+386.174788467" observedRunningTime="2026-04-17 21:20:31.344716427 +0000 UTC m=+386.914229179" watchObservedRunningTime="2026-04-17 21:20:31.345747285 +0000 UTC 
m=+386.915260038" Apr 17 21:20:31.461027 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.460997 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:31.585509 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.585397 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-util\") pod \"2161f627-afcb-42c6-8572-9fca96308b89\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " Apr 17 21:20:31.585681 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.585550 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-bundle\") pod \"2161f627-afcb-42c6-8572-9fca96308b89\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " Apr 17 21:20:31.585681 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.585583 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzjff\" (UniqueName: \"kubernetes.io/projected/2161f627-afcb-42c6-8572-9fca96308b89-kube-api-access-tzjff\") pod \"2161f627-afcb-42c6-8572-9fca96308b89\" (UID: \"2161f627-afcb-42c6-8572-9fca96308b89\") " Apr 17 21:20:31.585945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.585924 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-bundle" (OuterVolumeSpecName: "bundle") pod "2161f627-afcb-42c6-8572-9fca96308b89" (UID: "2161f627-afcb-42c6-8572-9fca96308b89"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:20:31.587750 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.587722 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2161f627-afcb-42c6-8572-9fca96308b89-kube-api-access-tzjff" (OuterVolumeSpecName: "kube-api-access-tzjff") pod "2161f627-afcb-42c6-8572-9fca96308b89" (UID: "2161f627-afcb-42c6-8572-9fca96308b89"). InnerVolumeSpecName "kube-api-access-tzjff". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:20:31.590059 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.590035 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-util" (OuterVolumeSpecName: "util") pod "2161f627-afcb-42c6-8572-9fca96308b89" (UID: "2161f627-afcb-42c6-8572-9fca96308b89"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:20:31.686978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.686938 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tzjff\" (UniqueName: \"kubernetes.io/projected/2161f627-afcb-42c6-8572-9fca96308b89-kube-api-access-tzjff\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:20:31.686978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.686970 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:20:31.686978 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:31.686979 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2161f627-afcb-42c6-8572-9fca96308b89-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:20:32.335162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.335133 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" Apr 17 21:20:32.335162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.335137 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmpfgd" event={"ID":"2161f627-afcb-42c6-8572-9fca96308b89","Type":"ContainerDied","Data":"047e9d2aff54a6ffdfcb607ac1c71b1fda53fda4206e1c7a04029aceb331ea1a"} Apr 17 21:20:32.335162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.335169 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="047e9d2aff54a6ffdfcb607ac1c71b1fda53fda4206e1c7a04029aceb331ea1a" Apr 17 21:20:32.430169 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430135 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hfddf"] Apr 17 21:20:32.430456 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430445 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="util" Apr 17 21:20:32.430494 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430458 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="util" Apr 17 21:20:32.430494 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430466 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="pull" Apr 17 21:20:32.430494 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430472 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="pull" Apr 17 21:20:32.430494 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430481 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="extract" Apr 17 21:20:32.430494 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430487 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="extract" Apr 17 21:20:32.430655 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.430561 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="2161f627-afcb-42c6-8572-9fca96308b89" containerName="extract" Apr 17 21:20:32.433277 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.433258 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf" Apr 17 21:20:32.436106 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.436085 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-65rv8\"" Apr 17 21:20:32.443152 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.443128 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hfddf"] Apr 17 21:20:32.596501 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.596470 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75jkr\" (UniqueName: \"kubernetes.io/projected/3f2f33ea-64c1-4d80-831b-b532bf55572b-kube-api-access-75jkr\") pod \"cert-manager-cainjector-8966b78d4-hfddf\" (UID: \"3f2f33ea-64c1-4d80-831b-b532bf55572b\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf" Apr 17 21:20:32.596678 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.596530 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f2f33ea-64c1-4d80-831b-b532bf55572b-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hfddf\" (UID: \"3f2f33ea-64c1-4d80-831b-b532bf55572b\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf"
Apr 17 21:20:32.697191 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.697153 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75jkr\" (UniqueName: \"kubernetes.io/projected/3f2f33ea-64c1-4d80-831b-b532bf55572b-kube-api-access-75jkr\") pod \"cert-manager-cainjector-8966b78d4-hfddf\" (UID: \"3f2f33ea-64c1-4d80-831b-b532bf55572b\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf"
Apr 17 21:20:32.697349 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.697201 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f2f33ea-64c1-4d80-831b-b532bf55572b-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hfddf\" (UID: \"3f2f33ea-64c1-4d80-831b-b532bf55572b\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf"
Apr 17 21:20:32.705371 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.705331 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f2f33ea-64c1-4d80-831b-b532bf55572b-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-hfddf\" (UID: \"3f2f33ea-64c1-4d80-831b-b532bf55572b\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf"
Apr 17 21:20:32.705542 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.705412 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75jkr\" (UniqueName: \"kubernetes.io/projected/3f2f33ea-64c1-4d80-831b-b532bf55572b-kube-api-access-75jkr\") pod \"cert-manager-cainjector-8966b78d4-hfddf\" (UID: \"3f2f33ea-64c1-4d80-831b-b532bf55572b\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf"
Apr 17 21:20:32.743319 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.743288 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf"
Apr 17 21:20:32.862601 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:32.862575 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-hfddf"]
Apr 17 21:20:32.864400 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:20:32.864371 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f2f33ea_64c1_4d80_831b_b532bf55572b.slice/crio-68142abf60edb187c2cb2152b8d9117ce2a56305b03a3290f3c41300e26c7597 WatchSource:0}: Error finding container 68142abf60edb187c2cb2152b8d9117ce2a56305b03a3290f3c41300e26c7597: Status 404 returned error can't find the container with id 68142abf60edb187c2cb2152b8d9117ce2a56305b03a3290f3c41300e26c7597
Apr 17 21:20:33.339658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:33.339565 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf" event={"ID":"3f2f33ea-64c1-4d80-831b-b532bf55572b","Type":"ContainerStarted","Data":"508d638d380e83e604d328261e708fadbc5585c65579a87b9516ca24802eacca"}
Apr 17 21:20:33.339658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:33.339604 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf" event={"ID":"3f2f33ea-64c1-4d80-831b-b532bf55572b","Type":"ContainerStarted","Data":"68142abf60edb187c2cb2152b8d9117ce2a56305b03a3290f3c41300e26c7597"}
Apr 17 21:20:33.354171 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:33.354126 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-hfddf" podStartSLOduration=1.354111965 podStartE2EDuration="1.354111965s" podCreationTimestamp="2026-04-17 21:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:20:33.353485727 +0000 UTC m=+388.922998481" watchObservedRunningTime="2026-04-17 21:20:33.354111965 +0000 UTC m=+388.923624718"
Apr 17 21:20:37.337778 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:37.337745 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-h59ln"
Apr 17 21:20:44.876246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.876211 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-tdr6r"]
Apr 17 21:20:44.883380 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.883353 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:44.883603 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.883577 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e104e62-77f2-4595-ac7f-9bb1db3209fd-bound-sa-token\") pod \"cert-manager-759f64656b-tdr6r\" (UID: \"4e104e62-77f2-4595-ac7f-9bb1db3209fd\") " pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:44.883690 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.883649 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r2jm\" (UniqueName: \"kubernetes.io/projected/4e104e62-77f2-4595-ac7f-9bb1db3209fd-kube-api-access-6r2jm\") pod \"cert-manager-759f64656b-tdr6r\" (UID: \"4e104e62-77f2-4595-ac7f-9bb1db3209fd\") " pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:44.886270 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.886248 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-mb8xx\""
Apr 17 21:20:44.888285 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.888261 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-tdr6r"]
Apr 17 21:20:44.984678 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.984649 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e104e62-77f2-4595-ac7f-9bb1db3209fd-bound-sa-token\") pod \"cert-manager-759f64656b-tdr6r\" (UID: \"4e104e62-77f2-4595-ac7f-9bb1db3209fd\") " pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:44.984845 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.984699 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r2jm\" (UniqueName: \"kubernetes.io/projected/4e104e62-77f2-4595-ac7f-9bb1db3209fd-kube-api-access-6r2jm\") pod \"cert-manager-759f64656b-tdr6r\" (UID: \"4e104e62-77f2-4595-ac7f-9bb1db3209fd\") " pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:44.992153 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.992122 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e104e62-77f2-4595-ac7f-9bb1db3209fd-bound-sa-token\") pod \"cert-manager-759f64656b-tdr6r\" (UID: \"4e104e62-77f2-4595-ac7f-9bb1db3209fd\") " pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:44.992324 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:44.992305 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r2jm\" (UniqueName: \"kubernetes.io/projected/4e104e62-77f2-4595-ac7f-9bb1db3209fd-kube-api-access-6r2jm\") pod \"cert-manager-759f64656b-tdr6r\" (UID: \"4e104e62-77f2-4595-ac7f-9bb1db3209fd\") " pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:45.194209 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.194116 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-tdr6r"
Apr 17 21:20:45.312281 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.312253 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-tdr6r"]
Apr 17 21:20:45.314384 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:20:45.314353 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e104e62_77f2_4595_ac7f_9bb1db3209fd.slice/crio-6b2b87ad462d0cc934e372dda4b97d1188dd10b2a1eecb219b9e89001448c416 WatchSource:0}: Error finding container 6b2b87ad462d0cc934e372dda4b97d1188dd10b2a1eecb219b9e89001448c416: Status 404 returned error can't find the container with id 6b2b87ad462d0cc934e372dda4b97d1188dd10b2a1eecb219b9e89001448c416
Apr 17 21:20:45.385814 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.385763 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-tdr6r" event={"ID":"4e104e62-77f2-4595-ac7f-9bb1db3209fd","Type":"ContainerStarted","Data":"8d4305ef39e0b3a9566456c604287f8008276bdf0844750a8276d947f5a1241e"}
Apr 17 21:20:45.385952 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.385830 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-tdr6r" event={"ID":"4e104e62-77f2-4595-ac7f-9bb1db3209fd","Type":"ContainerStarted","Data":"6b2b87ad462d0cc934e372dda4b97d1188dd10b2a1eecb219b9e89001448c416"}
Apr 17 21:20:45.400500 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.400457 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-tdr6r" podStartSLOduration=1.400442072 podStartE2EDuration="1.400442072s" podCreationTimestamp="2026-04-17 21:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:20:45.39958027 +0000 UTC m=+400.969093022" watchObservedRunningTime="2026-04-17 21:20:45.400442072 +0000 UTC m=+400.969954824"
Apr 17 21:20:45.783905 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.783864 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"]
Apr 17 21:20:45.787500 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.787484 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.790028 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.790003 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t66c9\""
Apr 17 21:20:45.790154 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.790026 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 21:20:45.790154 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.790026 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 21:20:45.790381 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.790361 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.790429 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.790417 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv65g\" (UniqueName: \"kubernetes.io/projected/e146fc31-7b76-4790-ae73-59d7ffcefacc-kube-api-access-hv65g\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.790572 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.790556 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.798246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.798222 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"]
Apr 17 21:20:45.890906 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.890874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hv65g\" (UniqueName: \"kubernetes.io/projected/e146fc31-7b76-4790-ae73-59d7ffcefacc-kube-api-access-hv65g\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.891339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.890963 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.891339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.891009 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.891339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.891323 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.891486 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.891378 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:45.898447 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:45.898419 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv65g\" (UniqueName: \"kubernetes.io/projected/e146fc31-7b76-4790-ae73-59d7ffcefacc-kube-api-access-hv65g\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:46.096843 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:46.096816 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:46.221610 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:46.221581 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"]
Apr 17 21:20:46.223538 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:20:46.223496 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode146fc31_7b76_4790_ae73_59d7ffcefacc.slice/crio-f38e44830154e49758cae0b224dd7465e2dfcd8ca5bbe50c7d7a468970709e13 WatchSource:0}: Error finding container f38e44830154e49758cae0b224dd7465e2dfcd8ca5bbe50c7d7a468970709e13: Status 404 returned error can't find the container with id f38e44830154e49758cae0b224dd7465e2dfcd8ca5bbe50c7d7a468970709e13
Apr 17 21:20:46.391167 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:46.391082 2567 generic.go:358] "Generic (PLEG): container finished" podID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerID="d9e3a377bbc4d5e4880a207e80123cec582322fbf836c7f95b4d4d1b9cf5905e" exitCode=0
Apr 17 21:20:46.391303 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:46.391165 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2" event={"ID":"e146fc31-7b76-4790-ae73-59d7ffcefacc","Type":"ContainerDied","Data":"d9e3a377bbc4d5e4880a207e80123cec582322fbf836c7f95b4d4d1b9cf5905e"}
Apr 17 21:20:46.391303 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:46.391201 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2" event={"ID":"e146fc31-7b76-4790-ae73-59d7ffcefacc","Type":"ContainerStarted","Data":"f38e44830154e49758cae0b224dd7465e2dfcd8ca5bbe50c7d7a468970709e13"}
Apr 17 21:20:47.396349 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:47.396312 2567 generic.go:358] "Generic (PLEG): container finished" podID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerID="95244ffadd731281186cfd647398dec160a6827263e8d1835892c32cc07b6122" exitCode=0
Apr 17 21:20:47.396853 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:47.396401 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2" event={"ID":"e146fc31-7b76-4790-ae73-59d7ffcefacc","Type":"ContainerDied","Data":"95244ffadd731281186cfd647398dec160a6827263e8d1835892c32cc07b6122"}
Apr 17 21:20:48.401605 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:48.401572 2567 generic.go:358] "Generic (PLEG): container finished" podID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerID="654a7261722140fe24ddc5f8280076036dcc794a5cd689e1eb0e9e04203fc8df" exitCode=0
Apr 17 21:20:48.402011 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:48.401631 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2" event={"ID":"e146fc31-7b76-4790-ae73-59d7ffcefacc","Type":"ContainerDied","Data":"654a7261722140fe24ddc5f8280076036dcc794a5cd689e1eb0e9e04203fc8df"}
Apr 17 21:20:49.524253 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.524227 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:49.623976 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.623942 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-util\") pod \"e146fc31-7b76-4790-ae73-59d7ffcefacc\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") "
Apr 17 21:20:49.623976 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.623982 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-bundle\") pod \"e146fc31-7b76-4790-ae73-59d7ffcefacc\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") "
Apr 17 21:20:49.624191 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.624044 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv65g\" (UniqueName: \"kubernetes.io/projected/e146fc31-7b76-4790-ae73-59d7ffcefacc-kube-api-access-hv65g\") pod \"e146fc31-7b76-4790-ae73-59d7ffcefacc\" (UID: \"e146fc31-7b76-4790-ae73-59d7ffcefacc\") "
Apr 17 21:20:49.624757 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.624727 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-bundle" (OuterVolumeSpecName: "bundle") pod "e146fc31-7b76-4790-ae73-59d7ffcefacc" (UID: "e146fc31-7b76-4790-ae73-59d7ffcefacc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:20:49.626132 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.626106 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e146fc31-7b76-4790-ae73-59d7ffcefacc-kube-api-access-hv65g" (OuterVolumeSpecName: "kube-api-access-hv65g") pod "e146fc31-7b76-4790-ae73-59d7ffcefacc" (UID: "e146fc31-7b76-4790-ae73-59d7ffcefacc"). InnerVolumeSpecName "kube-api-access-hv65g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:20:49.629409 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.629374 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-util" (OuterVolumeSpecName: "util") pod "e146fc31-7b76-4790-ae73-59d7ffcefacc" (UID: "e146fc31-7b76-4790-ae73-59d7ffcefacc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:20:49.725246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.725153 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:49.725246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.725184 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e146fc31-7b76-4790-ae73-59d7ffcefacc-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:49.725246 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:49.725194 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hv65g\" (UniqueName: \"kubernetes.io/projected/e146fc31-7b76-4790-ae73-59d7ffcefacc-kube-api-access-hv65g\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:20:50.410577 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:50.410540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2" event={"ID":"e146fc31-7b76-4790-ae73-59d7ffcefacc","Type":"ContainerDied","Data":"f38e44830154e49758cae0b224dd7465e2dfcd8ca5bbe50c7d7a468970709e13"}
Apr 17 21:20:50.410577 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:50.410574 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f38e44830154e49758cae0b224dd7465e2dfcd8ca5bbe50c7d7a468970709e13"
Apr 17 21:20:50.410577 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:50.410579 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5txfm2"
Apr 17 21:20:59.579828 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.579793 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"]
Apr 17 21:20:59.580204 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580126 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="util"
Apr 17 21:20:59.580204 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580136 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="util"
Apr 17 21:20:59.580204 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580146 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="extract"
Apr 17 21:20:59.580204 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580154 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="extract"
Apr 17 21:20:59.580204 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580161 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="pull"
Apr 17 21:20:59.580204 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580166 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="pull"
Apr 17 21:20:59.580387 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.580219 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e146fc31-7b76-4790-ae73-59d7ffcefacc" containerName="extract"
Apr 17 21:20:59.583313 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.583296 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.585839 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.585816 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 21:20:59.585954 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.585874 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 21:20:59.586914 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.586899 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t66c9\""
Apr 17 21:20:59.590330 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.590308 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"]
Apr 17 21:20:59.615660 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.615630 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.615795 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.615689 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q72c\" (UniqueName: \"kubernetes.io/projected/875ec8ac-b951-44bd-9211-47fa0cf2140a-kube-api-access-8q72c\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.615795 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.615728 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.716898 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.716860 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.717069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.716908 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.717069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.716975 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q72c\" (UniqueName: \"kubernetes.io/projected/875ec8ac-b951-44bd-9211-47fa0cf2140a-kube-api-access-8q72c\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.717291 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.717257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.717291 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.717284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.724898 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.724860 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q72c\" (UniqueName: \"kubernetes.io/projected/875ec8ac-b951-44bd-9211-47fa0cf2140a-kube-api-access-8q72c\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:20:59.893509 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:20:59.893472 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"
Apr 17 21:21:00.034545 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.034504 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b"]
Apr 17 21:21:00.452705 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.452669 2567 generic.go:358] "Generic (PLEG): container finished" podID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerID="0287e6aabf26052ae3bb69752ee42bb9f42ea14c0afe18218b77c3e01a3098e8" exitCode=0
Apr 17 21:21:00.452940 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.452758 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" event={"ID":"875ec8ac-b951-44bd-9211-47fa0cf2140a","Type":"ContainerDied","Data":"0287e6aabf26052ae3bb69752ee42bb9f42ea14c0afe18218b77c3e01a3098e8"}
Apr 17 21:21:00.452940 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.452798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" event={"ID":"875ec8ac-b951-44bd-9211-47fa0cf2140a","Type":"ContainerStarted","Data":"f03a8a39f3eb68bf15ccb4c431d2e1fd296444667a5f8e452873b12c5645692a"}
Apr 17 21:21:00.507129 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.507095 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"]
Apr 17 21:21:00.510305 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.510287 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.513227 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.513196 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 21:21:00.513227 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.513215 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 21:21:00.513414 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.513223 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 21:21:00.513414 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.513274 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-2j2w8\""
Apr 17 21:21:00.513414 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.513241 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 21:21:00.526161 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.526133 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"]
Apr 17 21:21:00.627015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.626979 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/463b17e4-2b3e-46d5-affc-05862505d3ba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.627015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.627022 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/463b17e4-2b3e-46d5-affc-05862505d3ba-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.627420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.627071 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctt65\" (UniqueName: \"kubernetes.io/projected/463b17e4-2b3e-46d5-affc-05862505d3ba-kube-api-access-ctt65\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.728430 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.728288 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctt65\" (UniqueName: \"kubernetes.io/projected/463b17e4-2b3e-46d5-affc-05862505d3ba-kube-api-access-ctt65\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.728430 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.728416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/463b17e4-2b3e-46d5-affc-05862505d3ba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.728659 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.728447 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/463b17e4-2b3e-46d5-affc-05862505d3ba-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.730862 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.730829 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/463b17e4-2b3e-46d5-affc-05862505d3ba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.730973 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.730878 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/463b17e4-2b3e-46d5-affc-05862505d3ba-webhook-cert\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.742863 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.742837 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctt65\" (UniqueName: \"kubernetes.io/projected/463b17e4-2b3e-46d5-affc-05862505d3ba-kube-api-access-ctt65\") pod \"opendatahub-operator-controller-manager-694fdf7c65-95kfc\" (UID: \"463b17e4-2b3e-46d5-affc-05862505d3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.821970 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.821931 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"
Apr 17 21:21:00.957073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:00.957043 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc"]
Apr 17 21:21:00.975159 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:21:00.975123 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod463b17e4_2b3e_46d5_affc_05862505d3ba.slice/crio-206e6e7d807914ef9f471b3edf229378fa5bb95145efb6d3accbfeaf0d55d8dd WatchSource:0}: Error finding container 206e6e7d807914ef9f471b3edf229378fa5bb95145efb6d3accbfeaf0d55d8dd: Status 404 returned error can't find the container with id 206e6e7d807914ef9f471b3edf229378fa5bb95145efb6d3accbfeaf0d55d8dd
Apr 17 21:21:01.459652 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:01.459496 2567 generic.go:358] "Generic (PLEG): container finished" podID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerID="37b70f786231d3e7e60a2dfaeb48e042cdd99490559369a961e6250daece8bee" exitCode=0
Apr 17 21:21:01.459652 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:01.459609 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" event={"ID":"875ec8ac-b951-44bd-9211-47fa0cf2140a","Type":"ContainerDied","Data":"37b70f786231d3e7e60a2dfaeb48e042cdd99490559369a961e6250daece8bee"}
Apr 17 21:21:01.461676 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:01.461643 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc" event={"ID":"463b17e4-2b3e-46d5-affc-05862505d3ba","Type":"ContainerStarted","Data":"206e6e7d807914ef9f471b3edf229378fa5bb95145efb6d3accbfeaf0d55d8dd"}
Apr 17 21:21:02.469701 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:02.469663 2567 generic.go:358] "Generic
(PLEG): container finished" podID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerID="5e194c3678685d6d304ad0738f078f1bc4dc226718d970e46fb1cc14655cdbe3" exitCode=0 Apr 17 21:21:02.470082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:02.469718 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" event={"ID":"875ec8ac-b951-44bd-9211-47fa0cf2140a","Type":"ContainerDied","Data":"5e194c3678685d6d304ad0738f078f1bc4dc226718d970e46fb1cc14655cdbe3"} Apr 17 21:21:03.475333 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.475298 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc" event={"ID":"463b17e4-2b3e-46d5-affc-05862505d3ba","Type":"ContainerStarted","Data":"4be83bd11f089dcca3db8b01ce50ac83031dc26238600e24ead64f9bc4301397"} Apr 17 21:21:03.475739 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.475424 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc" Apr 17 21:21:03.496617 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.496568 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc" podStartSLOduration=1.110748098 podStartE2EDuration="3.496554442s" podCreationTimestamp="2026-04-17 21:21:00 +0000 UTC" firstStartedPulling="2026-04-17 21:21:00.976924248 +0000 UTC m=+416.546436980" lastFinishedPulling="2026-04-17 21:21:03.362730589 +0000 UTC m=+418.932243324" observedRunningTime="2026-04-17 21:21:03.494165135 +0000 UTC m=+419.063677914" watchObservedRunningTime="2026-04-17 21:21:03.496554442 +0000 UTC m=+419.066067196" Apr 17 21:21:03.599297 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.599266 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" Apr 17 21:21:03.656645 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.656564 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q72c\" (UniqueName: \"kubernetes.io/projected/875ec8ac-b951-44bd-9211-47fa0cf2140a-kube-api-access-8q72c\") pod \"875ec8ac-b951-44bd-9211-47fa0cf2140a\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " Apr 17 21:21:03.656645 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.656632 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-bundle\") pod \"875ec8ac-b951-44bd-9211-47fa0cf2140a\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " Apr 17 21:21:03.656844 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.656696 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-util\") pod \"875ec8ac-b951-44bd-9211-47fa0cf2140a\" (UID: \"875ec8ac-b951-44bd-9211-47fa0cf2140a\") " Apr 17 21:21:03.657448 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.657409 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-bundle" (OuterVolumeSpecName: "bundle") pod "875ec8ac-b951-44bd-9211-47fa0cf2140a" (UID: "875ec8ac-b951-44bd-9211-47fa0cf2140a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:21:03.658687 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.658662 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/875ec8ac-b951-44bd-9211-47fa0cf2140a-kube-api-access-8q72c" (OuterVolumeSpecName: "kube-api-access-8q72c") pod "875ec8ac-b951-44bd-9211-47fa0cf2140a" (UID: "875ec8ac-b951-44bd-9211-47fa0cf2140a"). InnerVolumeSpecName "kube-api-access-8q72c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:21:03.662098 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.662073 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-util" (OuterVolumeSpecName: "util") pod "875ec8ac-b951-44bd-9211-47fa0cf2140a" (UID: "875ec8ac-b951-44bd-9211-47fa0cf2140a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:21:03.757799 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.757763 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:03.757799 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.757792 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8q72c\" (UniqueName: \"kubernetes.io/projected/875ec8ac-b951-44bd-9211-47fa0cf2140a-kube-api-access-8q72c\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:03.757799 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:03.757805 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/875ec8ac-b951-44bd-9211-47fa0cf2140a-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:04.480957 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:04.480927 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" Apr 17 21:21:04.481327 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:04.480924 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c96vm9b" event={"ID":"875ec8ac-b951-44bd-9211-47fa0cf2140a","Type":"ContainerDied","Data":"f03a8a39f3eb68bf15ccb4c431d2e1fd296444667a5f8e452873b12c5645692a"} Apr 17 21:21:04.481327 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:04.481037 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f03a8a39f3eb68bf15ccb4c431d2e1fd296444667a5f8e452873b12c5645692a" Apr 17 21:21:12.476202 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476169 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92"] Apr 17 21:21:12.476694 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476676 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="pull" Apr 17 21:21:12.476743 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476699 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="pull" Apr 17 21:21:12.476743 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476726 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="extract" Apr 17 21:21:12.476743 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476735 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="extract" Apr 17 21:21:12.476838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476766 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="util" Apr 17 21:21:12.476838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476775 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="util" Apr 17 21:21:12.476905 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.476868 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="875ec8ac-b951-44bd-9211-47fa0cf2140a" containerName="extract" Apr 17 21:21:12.481821 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.481802 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.485359 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.485338 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 21:21:12.486629 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.486613 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 21:21:12.486710 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.486693 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 21:21:12.486904 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.486888 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 21:21:12.486985 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.486907 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:21:12.486985 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.486919 2567 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z2pgk\"" Apr 17 21:21:12.492158 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.492137 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92"] Apr 17 21:21:12.631281 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.631243 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/348290e3-f3af-4dd5-9881-2b6543bd7481-metrics-cert\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.631560 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.631297 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/348290e3-f3af-4dd5-9881-2b6543bd7481-cert\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.631560 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.631316 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jzcx\" (UniqueName: \"kubernetes.io/projected/348290e3-f3af-4dd5-9881-2b6543bd7481-kube-api-access-6jzcx\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.631560 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.631381 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/348290e3-f3af-4dd5-9881-2b6543bd7481-manager-config\") pod 
\"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.732196 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.732103 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/348290e3-f3af-4dd5-9881-2b6543bd7481-cert\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.732196 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.732138 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jzcx\" (UniqueName: \"kubernetes.io/projected/348290e3-f3af-4dd5-9881-2b6543bd7481-kube-api-access-6jzcx\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.732196 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.732171 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/348290e3-f3af-4dd5-9881-2b6543bd7481-manager-config\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.732590 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.732262 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/348290e3-f3af-4dd5-9881-2b6543bd7481-metrics-cert\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.733205 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:21:12.733177 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/348290e3-f3af-4dd5-9881-2b6543bd7481-manager-config\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.735254 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.735232 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/348290e3-f3af-4dd5-9881-2b6543bd7481-cert\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.735368 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.735299 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/348290e3-f3af-4dd5-9881-2b6543bd7481-metrics-cert\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.743690 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.743658 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jzcx\" (UniqueName: \"kubernetes.io/projected/348290e3-f3af-4dd5-9881-2b6543bd7481-kube-api-access-6jzcx\") pod \"lws-controller-manager-54f8864c6c-4wj92\" (UID: \"348290e3-f3af-4dd5-9881-2b6543bd7481\") " pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.791847 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.791809 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:12.913589 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:12.913560 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92"] Apr 17 21:21:12.915430 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:21:12.915407 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod348290e3_f3af_4dd5_9881_2b6543bd7481.slice/crio-89401e04413f7ef002e13e9715b964c7ae99a764789d37e6d0072d3f0a101764 WatchSource:0}: Error finding container 89401e04413f7ef002e13e9715b964c7ae99a764789d37e6d0072d3f0a101764: Status 404 returned error can't find the container with id 89401e04413f7ef002e13e9715b964c7ae99a764789d37e6d0072d3f0a101764 Apr 17 21:21:13.513284 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:13.513245 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" event={"ID":"348290e3-f3af-4dd5-9881-2b6543bd7481","Type":"ContainerStarted","Data":"89401e04413f7ef002e13e9715b964c7ae99a764789d37e6d0072d3f0a101764"} Apr 17 21:21:14.483957 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:14.483921 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-694fdf7c65-95kfc" Apr 17 21:21:15.522594 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:15.522553 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" event={"ID":"348290e3-f3af-4dd5-9881-2b6543bd7481","Type":"ContainerStarted","Data":"3d7073d199adc772f26abb08286ae3e519b10f29f746bde868eebc8a6e686ab4"} Apr 17 21:21:15.522976 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:15.522678 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:15.546157 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:15.546105 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" podStartSLOduration=1.714948865 podStartE2EDuration="3.546090631s" podCreationTimestamp="2026-04-17 21:21:12 +0000 UTC" firstStartedPulling="2026-04-17 21:21:12.917150378 +0000 UTC m=+428.486663123" lastFinishedPulling="2026-04-17 21:21:14.748292154 +0000 UTC m=+430.317804889" observedRunningTime="2026-04-17 21:21:15.543931224 +0000 UTC m=+431.113443991" watchObservedRunningTime="2026-04-17 21:21:15.546090631 +0000 UTC m=+431.115603471" Apr 17 21:21:24.416977 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.416895 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db"] Apr 17 21:21:24.420263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.420246 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.422852 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.422828 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:21:24.423945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.423922 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:21:24.424027 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.423923 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t66c9\"" Apr 17 21:21:24.428703 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.428673 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db"] Apr 17 21:21:24.540541 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.540480 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.540705 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.540634 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jp6h\" (UniqueName: \"kubernetes.io/projected/45515cc4-b5c2-42de-b80e-1be04154a4d2-kube-api-access-8jp6h\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.540705 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.540660 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.641760 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.641724 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jp6h\" (UniqueName: \"kubernetes.io/projected/45515cc4-b5c2-42de-b80e-1be04154a4d2-kube-api-access-8jp6h\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.641928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.641775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.641928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.641827 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.642231 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.642211 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.642267 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.642238 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.655455 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.655424 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jp6h\" (UniqueName: \"kubernetes.io/projected/45515cc4-b5c2-42de-b80e-1be04154a4d2-kube-api-access-8jp6h\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.730827 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.730745 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:24.854207 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:24.854175 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db"] Apr 17 21:21:24.855674 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:21:24.855642 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45515cc4_b5c2_42de_b80e_1be04154a4d2.slice/crio-8da77d2af34708377ebbbdfb0c722818d6a836b99267d3c55c2ebe70188c2dcd WatchSource:0}: Error finding container 8da77d2af34708377ebbbdfb0c722818d6a836b99267d3c55c2ebe70188c2dcd: Status 404 returned error can't find the container with id 8da77d2af34708377ebbbdfb0c722818d6a836b99267d3c55c2ebe70188c2dcd Apr 17 21:21:25.566739 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:25.566700 2567 generic.go:358] "Generic (PLEG): container finished" podID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerID="5413cec4a4a764e7e841f0e8ae90936b0ae459aa82a3e5971ba18f3e86586d84" exitCode=0 Apr 17 21:21:25.567151 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:25.566791 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" event={"ID":"45515cc4-b5c2-42de-b80e-1be04154a4d2","Type":"ContainerDied","Data":"5413cec4a4a764e7e841f0e8ae90936b0ae459aa82a3e5971ba18f3e86586d84"} Apr 17 21:21:25.567151 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:25.566834 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" event={"ID":"45515cc4-b5c2-42de-b80e-1be04154a4d2","Type":"ContainerStarted","Data":"8da77d2af34708377ebbbdfb0c722818d6a836b99267d3c55c2ebe70188c2dcd"} Apr 17 21:21:26.529489 ip-10-0-134-198 kubenswrapper[2567]: 
I0417 21:21:26.529456 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54f8864c6c-4wj92" Apr 17 21:21:26.573311 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:26.573275 2567 generic.go:358] "Generic (PLEG): container finished" podID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerID="849760bdb13f4da4dd848ca396d1845023c219e054269cf0182a517b2471849b" exitCode=0 Apr 17 21:21:26.573717 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:26.573355 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" event={"ID":"45515cc4-b5c2-42de-b80e-1be04154a4d2","Type":"ContainerDied","Data":"849760bdb13f4da4dd848ca396d1845023c219e054269cf0182a517b2471849b"} Apr 17 21:21:27.578807 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:27.578774 2567 generic.go:358] "Generic (PLEG): container finished" podID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerID="7d4d048cf73bee3680d8fc95908c49ef815f533a6a5d5d61234f36669d68390b" exitCode=0 Apr 17 21:21:27.579193 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:27.578867 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" event={"ID":"45515cc4-b5c2-42de-b80e-1be04154a4d2","Type":"ContainerDied","Data":"7d4d048cf73bee3680d8fc95908c49ef815f533a6a5d5d61234f36669d68390b"} Apr 17 21:21:28.707471 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.707442 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:28.879986 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.879956 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jp6h\" (UniqueName: \"kubernetes.io/projected/45515cc4-b5c2-42de-b80e-1be04154a4d2-kube-api-access-8jp6h\") pod \"45515cc4-b5c2-42de-b80e-1be04154a4d2\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " Apr 17 21:21:28.880191 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.880043 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-util\") pod \"45515cc4-b5c2-42de-b80e-1be04154a4d2\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " Apr 17 21:21:28.880191 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.880110 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-bundle\") pod \"45515cc4-b5c2-42de-b80e-1be04154a4d2\" (UID: \"45515cc4-b5c2-42de-b80e-1be04154a4d2\") " Apr 17 21:21:28.880897 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.880870 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-bundle" (OuterVolumeSpecName: "bundle") pod "45515cc4-b5c2-42de-b80e-1be04154a4d2" (UID: "45515cc4-b5c2-42de-b80e-1be04154a4d2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:21:28.882179 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.882158 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45515cc4-b5c2-42de-b80e-1be04154a4d2-kube-api-access-8jp6h" (OuterVolumeSpecName: "kube-api-access-8jp6h") pod "45515cc4-b5c2-42de-b80e-1be04154a4d2" (UID: "45515cc4-b5c2-42de-b80e-1be04154a4d2"). InnerVolumeSpecName "kube-api-access-8jp6h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:21:28.885736 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.885711 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-util" (OuterVolumeSpecName: "util") pod "45515cc4-b5c2-42de-b80e-1be04154a4d2" (UID: "45515cc4-b5c2-42de-b80e-1be04154a4d2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:21:28.981758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.981724 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:28.981758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.981753 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8jp6h\" (UniqueName: \"kubernetes.io/projected/45515cc4-b5c2-42de-b80e-1be04154a4d2-kube-api-access-8jp6h\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:28.981758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:28.981763 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45515cc4-b5c2-42de-b80e-1be04154a4d2-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:29.587928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:29.587898 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" Apr 17 21:21:29.588130 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:29.587905 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qp2db" event={"ID":"45515cc4-b5c2-42de-b80e-1be04154a4d2","Type":"ContainerDied","Data":"8da77d2af34708377ebbbdfb0c722818d6a836b99267d3c55c2ebe70188c2dcd"} Apr 17 21:21:29.588130 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:29.588009 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da77d2af34708377ebbbdfb0c722818d6a836b99267d3c55c2ebe70188c2dcd" Apr 17 21:21:38.115032 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.114995 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw"] Apr 17 21:21:38.115403 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115336 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="extract" Apr 17 21:21:38.115403 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115348 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="extract" Apr 17 21:21:38.115403 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115360 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="util" Apr 17 21:21:38.115403 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115368 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="util" Apr 17 21:21:38.115403 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115383 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="pull" Apr 17 21:21:38.115403 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115390 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="pull" Apr 17 21:21:38.115621 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.115440 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="45515cc4-b5c2-42de-b80e-1be04154a4d2" containerName="extract" Apr 17 21:21:38.125138 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.125117 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.128860 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.128835 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-t66c9\"" Apr 17 21:21:38.130002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.129980 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:21:38.130090 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.129980 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:21:38.146570 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.146543 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw"] Apr 17 21:21:38.159011 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.158984 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: 
\"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.159155 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.159026 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.159200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.159148 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsgmg\" (UniqueName: \"kubernetes.io/projected/8efdeab8-42b0-4107-95de-9573044bd4b7-kube-api-access-zsgmg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.260561 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.260493 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.260757 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.260594 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.260757 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.260746 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsgmg\" (UniqueName: \"kubernetes.io/projected/8efdeab8-42b0-4107-95de-9573044bd4b7-kube-api-access-zsgmg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.260888 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.260871 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.260970 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.260949 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.276734 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.276702 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsgmg\" (UniqueName: \"kubernetes.io/projected/8efdeab8-42b0-4107-95de-9573044bd4b7-kube-api-access-zsgmg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.435392 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.434826 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:38.566976 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.566952 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw"] Apr 17 21:21:38.569496 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:21:38.569465 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8efdeab8_42b0_4107_95de_9573044bd4b7.slice/crio-00b4ae68901dfdf51a2700c469b77c88463ebf31ca44e8f95d127acf3b3d2a4a WatchSource:0}: Error finding container 00b4ae68901dfdf51a2700c469b77c88463ebf31ca44e8f95d127acf3b3d2a4a: Status 404 returned error can't find the container with id 00b4ae68901dfdf51a2700c469b77c88463ebf31ca44e8f95d127acf3b3d2a4a Apr 17 21:21:38.622724 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:38.622693 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" event={"ID":"8efdeab8-42b0-4107-95de-9573044bd4b7","Type":"ContainerStarted","Data":"00b4ae68901dfdf51a2700c469b77c88463ebf31ca44e8f95d127acf3b3d2a4a"} Apr 17 21:21:39.626968 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:39.626930 2567 generic.go:358] "Generic (PLEG): container finished" podID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerID="9eaf24e474b6fe6c2ea98508c4a5588ca7fcebe70e999fa1e7aa8f05fdfeb362" exitCode=0 Apr 17 21:21:39.627356 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:39.626993 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" event={"ID":"8efdeab8-42b0-4107-95de-9573044bd4b7","Type":"ContainerDied","Data":"9eaf24e474b6fe6c2ea98508c4a5588ca7fcebe70e999fa1e7aa8f05fdfeb362"} Apr 17 21:21:45.649575 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:45.649535 2567 generic.go:358] "Generic (PLEG): container finished" podID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerID="532fa5f6a385702cae3871a943d4226b552eace48ef94a13529b3d578381be1f" exitCode=0 Apr 17 21:21:45.649970 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:45.649608 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" event={"ID":"8efdeab8-42b0-4107-95de-9573044bd4b7","Type":"ContainerDied","Data":"532fa5f6a385702cae3871a943d4226b552eace48ef94a13529b3d578381be1f"} Apr 17 21:21:46.655513 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:46.655470 2567 generic.go:358] "Generic (PLEG): container finished" podID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerID="503a68204cde85ed8b6268e14bd56e8943233acb61931287197c6327d4ffef22" exitCode=0 Apr 17 21:21:46.655945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:46.655562 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" event={"ID":"8efdeab8-42b0-4107-95de-9573044bd4b7","Type":"ContainerDied","Data":"503a68204cde85ed8b6268e14bd56e8943233acb61931287197c6327d4ffef22"} Apr 17 21:21:47.782133 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.782108 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:47.844185 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.844136 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-bundle\") pod \"8efdeab8-42b0-4107-95de-9573044bd4b7\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " Apr 17 21:21:47.844351 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.844211 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsgmg\" (UniqueName: \"kubernetes.io/projected/8efdeab8-42b0-4107-95de-9573044bd4b7-kube-api-access-zsgmg\") pod \"8efdeab8-42b0-4107-95de-9573044bd4b7\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " Apr 17 21:21:47.844351 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.844238 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-util\") pod \"8efdeab8-42b0-4107-95de-9573044bd4b7\" (UID: \"8efdeab8-42b0-4107-95de-9573044bd4b7\") " Apr 17 21:21:47.845108 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.845077 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-bundle" (OuterVolumeSpecName: "bundle") pod "8efdeab8-42b0-4107-95de-9573044bd4b7" (UID: "8efdeab8-42b0-4107-95de-9573044bd4b7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:21:47.846388 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.846366 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efdeab8-42b0-4107-95de-9573044bd4b7-kube-api-access-zsgmg" (OuterVolumeSpecName: "kube-api-access-zsgmg") pod "8efdeab8-42b0-4107-95de-9573044bd4b7" (UID: "8efdeab8-42b0-4107-95de-9573044bd4b7"). InnerVolumeSpecName "kube-api-access-zsgmg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:21:47.848831 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.848810 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-util" (OuterVolumeSpecName: "util") pod "8efdeab8-42b0-4107-95de-9573044bd4b7" (UID: "8efdeab8-42b0-4107-95de-9573044bd4b7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:21:47.945514 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.945406 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:47.945514 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.945446 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsgmg\" (UniqueName: \"kubernetes.io/projected/8efdeab8-42b0-4107-95de-9573044bd4b7-kube-api-access-zsgmg\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:47.945514 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:47.945458 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8efdeab8-42b0-4107-95de-9573044bd4b7-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:21:48.664458 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:48.664426 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" Apr 17 21:21:48.664458 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:48.664436 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2wt4tw" event={"ID":"8efdeab8-42b0-4107-95de-9573044bd4b7","Type":"ContainerDied","Data":"00b4ae68901dfdf51a2700c469b77c88463ebf31ca44e8f95d127acf3b3d2a4a"} Apr 17 21:21:48.664676 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:21:48.664469 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b4ae68901dfdf51a2700c469b77c88463ebf31ca44e8f95d127acf3b3d2a4a" Apr 17 21:22:01.362870 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.362826 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"] Apr 17 21:22:01.363379 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363357 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="util" Apr 17 21:22:01.363450 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363382 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="util" Apr 17 21:22:01.363450 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363399 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="extract" Apr 17 21:22:01.363450 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363408 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="extract" Apr 17 21:22:01.363450 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363423 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="pull" Apr 17 21:22:01.363450 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363432 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="pull" Apr 17 21:22:01.363731 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.363558 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="8efdeab8-42b0-4107-95de-9573044bd4b7" containerName="extract" Apr 17 21:22:01.367344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.367317 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.370016 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.369897 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 21:22:01.370016 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.369912 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-4c9cj\"" Apr 17 21:22:01.370201 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.370064 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 21:22:01.370201 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.370163 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 21:22:01.375480 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.375449 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"] Apr 17 21:22:01.473726 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.473697 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/438ec947-206d-49a2-af2a-f80a0d472353-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.473906 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.473731 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.473906 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.473761 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjfjp\" (UniqueName: \"kubernetes.io/projected/438ec947-206d-49a2-af2a-f80a0d472353-kube-api-access-cjfjp\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.473906 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.473850 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/438ec947-206d-49a2-af2a-f80a0d472353-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.474018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.473913 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.474018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.473961 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/438ec947-206d-49a2-af2a-f80a0d472353-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.474018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.474004 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.474118 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.474066 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.474118 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.474085 2567 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574635 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/438ec947-206d-49a2-af2a-f80a0d472353-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574672 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574675 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574889 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574704 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574889 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574719 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574889 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574755 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/438ec947-206d-49a2-af2a-f80a0d472353-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574889 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574775 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.574889 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574815 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjfjp\" (UniqueName: \"kubernetes.io/projected/438ec947-206d-49a2-af2a-f80a0d472353-kube-api-access-cjfjp\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" Apr 17 21:22:01.575131 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574888 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/438ec947-206d-49a2-af2a-f80a0d472353-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.575131 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.574933 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.575131 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.575122 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.575263 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.575161 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.575355 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.575333 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.575445 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.575426 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.575704 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.575678 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/438ec947-206d-49a2-af2a-f80a0d472353-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.577380 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.577358 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/438ec947-206d-49a2-af2a-f80a0d472353-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.577499 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.577433 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/438ec947-206d-49a2-af2a-f80a0d472353-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.582425 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.582396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/438ec947-206d-49a2-af2a-f80a0d472353-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.582564 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.582513 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjfjp\" (UniqueName: \"kubernetes.io/projected/438ec947-206d-49a2-af2a-f80a0d472353-kube-api-access-cjfjp\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr\" (UID: \"438ec947-206d-49a2-af2a-f80a0d472353\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.680350 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.680261 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:01.809581 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:01.809553 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"]
Apr 17 21:22:01.811707 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:01.811684 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438ec947_206d_49a2_af2a_f80a0d472353.slice/crio-6acae275d596f11307af4ead753a6605e688a5fdf821880992e7fc08dd75abf0 WatchSource:0}: Error finding container 6acae275d596f11307af4ead753a6605e688a5fdf821880992e7fc08dd75abf0: Status 404 returned error can't find the container with id 6acae275d596f11307af4ead753a6605e688a5fdf821880992e7fc08dd75abf0
Apr 17 21:22:02.719272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:02.719228 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" event={"ID":"438ec947-206d-49a2-af2a-f80a0d472353","Type":"ContainerStarted","Data":"6acae275d596f11307af4ead753a6605e688a5fdf821880992e7fc08dd75abf0"}
Apr 17 21:22:05.672823 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:05.672783 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 21:22:05.673222 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:05.672871 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 21:22:05.673222 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:05.672917 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 21:22:06.741577 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:06.741536 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" event={"ID":"438ec947-206d-49a2-af2a-f80a0d472353","Type":"ContainerStarted","Data":"05b3eb7f6df0b9c610a822fb74e3442c505c03670bf6a1a2c9feb89ce1644a9e"}
Apr 17 21:22:06.760973 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:06.760922 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr" podStartSLOduration=1.902345048 podStartE2EDuration="5.760902423s" podCreationTimestamp="2026-04-17 21:22:01 +0000 UTC" firstStartedPulling="2026-04-17 21:22:01.8139386 +0000 UTC m=+477.383451331" lastFinishedPulling="2026-04-17 21:22:05.672495964 +0000 UTC m=+481.242008706" observedRunningTime="2026-04-17 21:22:06.759220975 +0000 UTC m=+482.328733757" watchObservedRunningTime="2026-04-17 21:22:06.760902423 +0000 UTC m=+482.330415183"
Apr 17 21:22:07.680599 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:07.680561 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:07.685210 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:07.685183 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:07.745556 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:07.745507 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:07.746458 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:07.746439 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr"
Apr 17 21:22:18.650695 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.650657 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5dz7l"]
Apr 17 21:22:18.655373 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.655348 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:18.658633 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.658612 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 21:22:18.659202 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.659177 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 21:22:18.659341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.659180 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-wcl5s\""
Apr 17 21:22:18.665172 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.665144 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5dz7l"]
Apr 17 21:22:18.715207 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.715169 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc98k\" (UniqueName: \"kubernetes.io/projected/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1-kube-api-access-tc98k\") pod \"kuadrant-operator-catalog-5dz7l\" (UID: \"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1\") " pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:18.815804 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.815768 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc98k\" (UniqueName: \"kubernetes.io/projected/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1-kube-api-access-tc98k\") pod \"kuadrant-operator-catalog-5dz7l\" (UID: \"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1\") " pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:18.823540 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.823482 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc98k\" (UniqueName: \"kubernetes.io/projected/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1-kube-api-access-tc98k\") pod \"kuadrant-operator-catalog-5dz7l\" (UID: \"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1\") " pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:18.966665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:18.966576 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:19.024509 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.024448 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5dz7l"]
Apr 17 21:22:19.109945 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.109920 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5dz7l"]
Apr 17 21:22:19.111679 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:19.111649 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod715b6a60_9dfa_4bcf_a9eb_76cc2ce0a1d1.slice/crio-75f9ca592bf0898026e31c8e343cdbe0a696676dcf5c08a26abbc49b8adc3a07 WatchSource:0}: Error finding container 75f9ca592bf0898026e31c8e343cdbe0a696676dcf5c08a26abbc49b8adc3a07: Status 404 returned error can't find the container with id 75f9ca592bf0898026e31c8e343cdbe0a696676dcf5c08a26abbc49b8adc3a07
Apr 17 21:22:19.229874 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.229791 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hx8gd"]
Apr 17 21:22:19.234505 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.234488 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:19.240093 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.240067 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hx8gd"]
Apr 17 21:22:19.319600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.319565 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-865b4\" (UniqueName: \"kubernetes.io/projected/0374bd95-cb75-4964-be0a-c883b7f390e7-kube-api-access-865b4\") pod \"kuadrant-operator-catalog-hx8gd\" (UID: \"0374bd95-cb75-4964-be0a-c883b7f390e7\") " pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:19.420841 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.420800 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-865b4\" (UniqueName: \"kubernetes.io/projected/0374bd95-cb75-4964-be0a-c883b7f390e7-kube-api-access-865b4\") pod \"kuadrant-operator-catalog-hx8gd\" (UID: \"0374bd95-cb75-4964-be0a-c883b7f390e7\") " pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:19.430618 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.430585 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-865b4\" (UniqueName: \"kubernetes.io/projected/0374bd95-cb75-4964-be0a-c883b7f390e7-kube-api-access-865b4\") pod \"kuadrant-operator-catalog-hx8gd\" (UID: \"0374bd95-cb75-4964-be0a-c883b7f390e7\") " pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:19.545272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.545190 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:19.672982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.672950 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-hx8gd"]
Apr 17 21:22:19.674195 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:19.674168 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0374bd95_cb75_4964_be0a_c883b7f390e7.slice/crio-64bb38a7b946dab9a59037ec22ad94ca476465d98ee995f1942a472824f7140d WatchSource:0}: Error finding container 64bb38a7b946dab9a59037ec22ad94ca476465d98ee995f1942a472824f7140d: Status 404 returned error can't find the container with id 64bb38a7b946dab9a59037ec22ad94ca476465d98ee995f1942a472824f7140d
Apr 17 21:22:19.792702 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.792664 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd" event={"ID":"0374bd95-cb75-4964-be0a-c883b7f390e7","Type":"ContainerStarted","Data":"64bb38a7b946dab9a59037ec22ad94ca476465d98ee995f1942a472824f7140d"}
Apr 17 21:22:19.793690 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:19.793663 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l" event={"ID":"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1","Type":"ContainerStarted","Data":"75f9ca592bf0898026e31c8e343cdbe0a696676dcf5c08a26abbc49b8adc3a07"}
Apr 17 21:22:21.803969 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:21.803924 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd" event={"ID":"0374bd95-cb75-4964-be0a-c883b7f390e7","Type":"ContainerStarted","Data":"71b3648661a59dc1e8c5fd1c573655203d5885fd2e80f7fa656147cb6b9e455f"}
Apr 17 21:22:21.805321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:21.805294 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l" event={"ID":"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1","Type":"ContainerStarted","Data":"2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f"}
Apr 17 21:22:21.805447 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:21.805379 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l" podUID="715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" containerName="registry-server" containerID="cri-o://2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f" gracePeriod=2
Apr 17 21:22:21.821147 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:21.821105 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd" podStartSLOduration=1.191076463 podStartE2EDuration="2.821088061s" podCreationTimestamp="2026-04-17 21:22:19 +0000 UTC" firstStartedPulling="2026-04-17 21:22:19.675626533 +0000 UTC m=+495.245139264" lastFinishedPulling="2026-04-17 21:22:21.30563813 +0000 UTC m=+496.875150862" observedRunningTime="2026-04-17 21:22:21.818038179 +0000 UTC m=+497.387550944" watchObservedRunningTime="2026-04-17 21:22:21.821088061 +0000 UTC m=+497.390600813"
Apr 17 21:22:21.834102 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:21.834053 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l" podStartSLOduration=1.644656852 podStartE2EDuration="3.834039752s" podCreationTimestamp="2026-04-17 21:22:18 +0000 UTC" firstStartedPulling="2026-04-17 21:22:19.113088123 +0000 UTC m=+494.682600854" lastFinishedPulling="2026-04-17 21:22:21.30247101 +0000 UTC m=+496.871983754" observedRunningTime="2026-04-17 21:22:21.832031225 +0000 UTC m=+497.401543978" watchObservedRunningTime="2026-04-17 21:22:21.834039752 +0000 UTC m=+497.403552504"
Apr 17 21:22:22.046861 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.046833 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:22.147507 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.147470 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc98k\" (UniqueName: \"kubernetes.io/projected/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1-kube-api-access-tc98k\") pod \"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1\" (UID: \"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1\") "
Apr 17 21:22:22.149774 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.149744 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1-kube-api-access-tc98k" (OuterVolumeSpecName: "kube-api-access-tc98k") pod "715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" (UID: "715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1"). InnerVolumeSpecName "kube-api-access-tc98k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:22:22.248420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.248377 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc98k\" (UniqueName: \"kubernetes.io/projected/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1-kube-api-access-tc98k\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\""
Apr 17 21:22:22.809803 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.809765 2567 generic.go:358] "Generic (PLEG): container finished" podID="715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" containerID="2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f" exitCode=0
Apr 17 21:22:22.810276 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.809825 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l"
Apr 17 21:22:22.810276 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.809856 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l" event={"ID":"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1","Type":"ContainerDied","Data":"2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f"}
Apr 17 21:22:22.810276 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.809904 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5dz7l" event={"ID":"715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1","Type":"ContainerDied","Data":"75f9ca592bf0898026e31c8e343cdbe0a696676dcf5c08a26abbc49b8adc3a07"}
Apr 17 21:22:22.810276 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.809929 2567 scope.go:117] "RemoveContainer" containerID="2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f"
Apr 17 21:22:22.819301 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.819283 2567 scope.go:117] "RemoveContainer" containerID="2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f"
Apr 17 21:22:22.819594 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:22:22.819575 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f\": container with ID starting with 2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f not found: ID does not exist" containerID="2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f"
Apr 17 21:22:22.819656 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.819603 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f"} err="failed to get container status \"2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f\": rpc error: code = NotFound desc = could not find container \"2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f\": container with ID starting with 2f0e755f9d5304fadfabd9ed05331c12e645354c995e57a7aba4bde24ccdac0f not found: ID does not exist"
Apr 17 21:22:22.831076 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.831049 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5dz7l"]
Apr 17 21:22:22.832935 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:22.832917 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5dz7l"]
Apr 17 21:22:23.055854 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:23.055819 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" path="/var/lib/kubelet/pods/715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1/volumes"
Apr 17 21:22:29.545435 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:29.545393 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:29.545938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:29.545547 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:29.566985 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:29.566959 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:29.856675 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:29.856645 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-hx8gd"
Apr 17 21:22:33.861210 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.861172 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"]
Apr 17 21:22:33.861625 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.861572 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" containerName="registry-server"
Apr 17 21:22:33.861625 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.861585 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" containerName="registry-server"
Apr 17 21:22:33.861699 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.861640 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="715b6a60-9dfa-4bcf-a9eb-76cc2ce0a1d1" containerName="registry-server"
Apr 17 21:22:33.864820 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.864803 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:33.867361 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.867344 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdvnh\""
Apr 17 21:22:33.872663 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.872548 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"]
Apr 17 21:22:33.948091 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.948052 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqt5q\" (UniqueName: \"kubernetes.io/projected/254ffd77-476c-48c4-88ff-2f89b0847e9c-kube-api-access-sqt5q\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:33.948266 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.948097 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:33.948266 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:33.948129 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.048928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.048886 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqt5q\" (UniqueName: \"kubernetes.io/projected/254ffd77-476c-48c4-88ff-2f89b0847e9c-kube-api-access-sqt5q\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.048928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.048934 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.049135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.048961 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.049325 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.049311 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.049369 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.049348 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.057026 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.057001 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqt5q\" (UniqueName: \"kubernetes.io/projected/254ffd77-476c-48c4-88ff-2f89b0847e9c-kube-api-access-sqt5q\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.175735 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.175635 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"
Apr 17 21:22:34.260713 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.260671 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"]
Apr 17 21:22:34.266104 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.266079 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.272064 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.272035 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"]
Apr 17 21:22:34.305758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.305733 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6"]
Apr 17 21:22:34.308129 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:34.308099 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod254ffd77_476c_48c4_88ff_2f89b0847e9c.slice/crio-edef390ff4a7412789e293046bae972c3c44a497743fec8182b5712cad7c3434 WatchSource:0}: Error finding container edef390ff4a7412789e293046bae972c3c44a497743fec8182b5712cad7c3434: Status 404 returned error can't find the container with id edef390ff4a7412789e293046bae972c3c44a497743fec8182b5712cad7c3434
Apr 17 21:22:34.351513 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.351473 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxs7b\" (UniqueName: \"kubernetes.io/projected/4b07c673-6251-43f9-9720-fcb0e598d9b6-kube-api-access-sxs7b\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.351665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.351548 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.351665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.351587 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.452393 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.452361 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxs7b\" (UniqueName: \"kubernetes.io/projected/4b07c673-6251-43f9-9720-fcb0e598d9b6-kube-api-access-sxs7b\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.452570 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.452414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.452570 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.452456 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.452880 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.452859 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.452913 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.452871 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.460029 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.460008 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxs7b\" (UniqueName: \"kubernetes.io/projected/4b07c673-6251-43f9-9720-fcb0e598d9b6-kube-api-access-sxs7b\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.580780 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.580744 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"
Apr 17 21:22:34.704860 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.704832 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v"]
Apr 17 21:22:34.706029 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:34.706003 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b07c673_6251_43f9_9720_fcb0e598d9b6.slice/crio-05f29ca2893fa30cbba17b455acfc0304178923c0058a454c29dda26af378b24 WatchSource:0}: Error finding container 05f29ca2893fa30cbba17b455acfc0304178923c0058a454c29dda26af378b24: Status 404 returned error can't find the container with id 05f29ca2893fa30cbba17b455acfc0304178923c0058a454c29dda26af378b24
Apr 17 21:22:34.855639 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.855601 2567 generic.go:358] "Generic (PLEG): container finished" podID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerID="8559f419b2f640e47c7201c3b9ffd31729d63f160c4fae34e07a608dbf6918ef" exitCode=0
Apr 17 21:22:34.855834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.855694 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" event={"ID":"254ffd77-476c-48c4-88ff-2f89b0847e9c","Type":"ContainerDied","Data":"8559f419b2f640e47c7201c3b9ffd31729d63f160c4fae34e07a608dbf6918ef"}
Apr 17 21:22:34.855834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.855728 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" event={"ID":"254ffd77-476c-48c4-88ff-2f89b0847e9c","Type":"ContainerStarted","Data":"edef390ff4a7412789e293046bae972c3c44a497743fec8182b5712cad7c3434"}
Apr 17 21:22:34.857203 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.857180 2567 generic.go:358] "Generic (PLEG): container finished" podID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerID="260b5fde15779b35fd1d3b922abf1a88bda1e48083c7c141ac4754df9d2173fe" exitCode=0
Apr 17 21:22:34.857307 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.857214 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" event={"ID":"4b07c673-6251-43f9-9720-fcb0e598d9b6","Type":"ContainerDied","Data":"260b5fde15779b35fd1d3b922abf1a88bda1e48083c7c141ac4754df9d2173fe"}
Apr 17 21:22:34.857307 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.857237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" event={"ID":"4b07c673-6251-43f9-9720-fcb0e598d9b6","Type":"ContainerStarted","Data":"05f29ca2893fa30cbba17b455acfc0304178923c0058a454c29dda26af378b24"}
Apr 17 21:22:34.862760 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.862739 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq"]
Apr 17 21:22:34.866352 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.866336 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:34.874613 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.874588 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq"] Apr 17 21:22:34.957163 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.957069 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:34.957322 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.957216 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d5nz\" (UniqueName: \"kubernetes.io/projected/c2358ee3-70da-40bd-a51a-ccce72501936-kube-api-access-2d5nz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:34.957322 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:34.957285 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.058076 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.058045 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.058247 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.058123 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d5nz\" (UniqueName: \"kubernetes.io/projected/c2358ee3-70da-40bd-a51a-ccce72501936-kube-api-access-2d5nz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.058247 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.058162 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.058409 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.058389 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.058469 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.058419 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.066532 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.066485 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d5nz\" (UniqueName: \"kubernetes.io/projected/c2358ee3-70da-40bd-a51a-ccce72501936-kube-api-access-2d5nz\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.177119 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.177074 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:35.307837 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.307793 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq"] Apr 17 21:22:35.309198 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:35.309167 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2358ee3_70da_40bd_a51a_ccce72501936.slice/crio-9d533c3fe59540b2c7f98ea65acdf569e56813ff3f2be61623046da444f55661 WatchSource:0}: Error finding container 9d533c3fe59540b2c7f98ea65acdf569e56813ff3f2be61623046da444f55661: Status 404 returned error can't find the container with id 9d533c3fe59540b2c7f98ea65acdf569e56813ff3f2be61623046da444f55661 Apr 17 21:22:35.461796 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.461761 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8"] Apr 17 21:22:35.466171 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:22:35.466148 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.473016 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.472990 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8"] Apr 17 21:22:35.563638 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.563553 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.563638 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.563600 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqr5k\" (UniqueName: \"kubernetes.io/projected/1f396d01-3669-4f17-bdd9-3cd350725333-kube-api-access-rqr5k\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.563818 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.563670 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.665187 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.665154 
2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.665362 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.665200 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqr5k\" (UniqueName: \"kubernetes.io/projected/1f396d01-3669-4f17-bdd9-3cd350725333-kube-api-access-rqr5k\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.665362 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.665220 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.665659 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.665633 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.665724 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.665700 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.673698 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.673675 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqr5k\" (UniqueName: \"kubernetes.io/projected/1f396d01-3669-4f17-bdd9-3cd350725333-kube-api-access-rqr5k\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.777065 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.777029 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:35.863143 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.863105 2567 generic.go:358] "Generic (PLEG): container finished" podID="c2358ee3-70da-40bd-a51a-ccce72501936" containerID="d3909dffdeb953b31d45910cd537abdc3732aa8d0cc6d7ebd05ef2e878134119" exitCode=0 Apr 17 21:22:35.863564 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.863273 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" event={"ID":"c2358ee3-70da-40bd-a51a-ccce72501936","Type":"ContainerDied","Data":"d3909dffdeb953b31d45910cd537abdc3732aa8d0cc6d7ebd05ef2e878134119"} Apr 17 21:22:35.863564 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.863307 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" 
event={"ID":"c2358ee3-70da-40bd-a51a-ccce72501936","Type":"ContainerStarted","Data":"9d533c3fe59540b2c7f98ea65acdf569e56813ff3f2be61623046da444f55661"} Apr 17 21:22:35.865934 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.865826 2567 generic.go:358] "Generic (PLEG): container finished" podID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerID="b1624c216fa5f639b7f0cb5284655e03cb04c595564f048f690948e6b0aff94d" exitCode=0 Apr 17 21:22:35.866085 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.866056 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" event={"ID":"4b07c673-6251-43f9-9720-fcb0e598d9b6","Type":"ContainerDied","Data":"b1624c216fa5f639b7f0cb5284655e03cb04c595564f048f690948e6b0aff94d"} Apr 17 21:22:35.869082 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.869060 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" event={"ID":"254ffd77-476c-48c4-88ff-2f89b0847e9c","Type":"ContainerStarted","Data":"2e867fbafee707800651ed3cc65c3c312a88a673a459d9a8a173e8888da0255b"} Apr 17 21:22:35.913007 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:35.912443 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8"] Apr 17 21:22:35.913007 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:35.912936 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f396d01_3669_4f17_bdd9_3cd350725333.slice/crio-f1d34b659227aa397bf3466464f417c2e66e3a59b625a1d73505c65a4b4e524b WatchSource:0}: Error finding container f1d34b659227aa397bf3466464f417c2e66e3a59b625a1d73505c65a4b4e524b: Status 404 returned error can't find the container with id f1d34b659227aa397bf3466464f417c2e66e3a59b625a1d73505c65a4b4e524b Apr 17 21:22:36.875121 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:22:36.875030 2567 generic.go:358] "Generic (PLEG): container finished" podID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerID="9d3506e5ea43c5b146de929001126d67fe8ab5bca4b1209955c159e4adac78ff" exitCode=0 Apr 17 21:22:36.875482 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.875114 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" event={"ID":"4b07c673-6251-43f9-9720-fcb0e598d9b6","Type":"ContainerDied","Data":"9d3506e5ea43c5b146de929001126d67fe8ab5bca4b1209955c159e4adac78ff"} Apr 17 21:22:36.876988 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.876934 2567 generic.go:358] "Generic (PLEG): container finished" podID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerID="2e867fbafee707800651ed3cc65c3c312a88a673a459d9a8a173e8888da0255b" exitCode=0 Apr 17 21:22:36.876988 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.876954 2567 generic.go:358] "Generic (PLEG): container finished" podID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerID="5b87d8ae8db8a6fce00e7df7f87c9d9e7dd766a77fc986a6465baf1e8bcb93f8" exitCode=0 Apr 17 21:22:36.876988 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.876970 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" event={"ID":"254ffd77-476c-48c4-88ff-2f89b0847e9c","Type":"ContainerDied","Data":"2e867fbafee707800651ed3cc65c3c312a88a673a459d9a8a173e8888da0255b"} Apr 17 21:22:36.877194 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.877003 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" event={"ID":"254ffd77-476c-48c4-88ff-2f89b0847e9c","Type":"ContainerDied","Data":"5b87d8ae8db8a6fce00e7df7f87c9d9e7dd766a77fc986a6465baf1e8bcb93f8"} Apr 17 21:22:36.878553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.878504 2567 
generic.go:358] "Generic (PLEG): container finished" podID="1f396d01-3669-4f17-bdd9-3cd350725333" containerID="a2e22b2ff1d86efd16ab8a6d335894ec1e584580c5e36775b5fa0bc462c1b800" exitCode=0 Apr 17 21:22:36.878553 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.878546 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" event={"ID":"1f396d01-3669-4f17-bdd9-3cd350725333","Type":"ContainerDied","Data":"a2e22b2ff1d86efd16ab8a6d335894ec1e584580c5e36775b5fa0bc462c1b800"} Apr 17 21:22:36.878686 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.878572 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" event={"ID":"1f396d01-3669-4f17-bdd9-3cd350725333","Type":"ContainerStarted","Data":"f1d34b659227aa397bf3466464f417c2e66e3a59b625a1d73505c65a4b4e524b"} Apr 17 21:22:36.880162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.880139 2567 generic.go:358] "Generic (PLEG): container finished" podID="c2358ee3-70da-40bd-a51a-ccce72501936" containerID="ac41a9666c882cfd2c06c6394d80cd68b56103510c3bfc6f688915a478f23447" exitCode=0 Apr 17 21:22:36.880245 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:36.880215 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" event={"ID":"c2358ee3-70da-40bd-a51a-ccce72501936","Type":"ContainerDied","Data":"ac41a9666c882cfd2c06c6394d80cd68b56103510c3bfc6f688915a478f23447"} Apr 17 21:22:37.886190 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:37.886097 2567 generic.go:358] "Generic (PLEG): container finished" podID="c2358ee3-70da-40bd-a51a-ccce72501936" containerID="b7b13f7264abc198e5ecb3b0e9862c21a50f9ab5315f2a6badfa6d2352dda033" exitCode=0 Apr 17 21:22:37.886636 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:37.886181 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" event={"ID":"c2358ee3-70da-40bd-a51a-ccce72501936","Type":"ContainerDied","Data":"b7b13f7264abc198e5ecb3b0e9862c21a50f9ab5315f2a6badfa6d2352dda033"} Apr 17 21:22:37.887732 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:37.887710 2567 generic.go:358] "Generic (PLEG): container finished" podID="1f396d01-3669-4f17-bdd9-3cd350725333" containerID="9bac3e3bbb4f01ae629171e5793ad1484c7e5974d5f0639b104fe55209b09d44" exitCode=0 Apr 17 21:22:37.887821 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:37.887798 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" event={"ID":"1f396d01-3669-4f17-bdd9-3cd350725333","Type":"ContainerDied","Data":"9bac3e3bbb4f01ae629171e5793ad1484c7e5974d5f0639b104fe55209b09d44"} Apr 17 21:22:38.029741 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.029710 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" Apr 17 21:22:38.042654 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.042629 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" Apr 17 21:22:38.087016 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.086987 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-util\") pod \"4b07c673-6251-43f9-9720-fcb0e598d9b6\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " Apr 17 21:22:38.087194 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.087040 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqt5q\" (UniqueName: \"kubernetes.io/projected/254ffd77-476c-48c4-88ff-2f89b0847e9c-kube-api-access-sqt5q\") pod \"254ffd77-476c-48c4-88ff-2f89b0847e9c\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " Apr 17 21:22:38.087194 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.087078 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxs7b\" (UniqueName: \"kubernetes.io/projected/4b07c673-6251-43f9-9720-fcb0e598d9b6-kube-api-access-sxs7b\") pod \"4b07c673-6251-43f9-9720-fcb0e598d9b6\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " Apr 17 21:22:38.087194 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.087102 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-util\") pod \"254ffd77-476c-48c4-88ff-2f89b0847e9c\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " Apr 17 21:22:38.087194 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.087190 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-bundle\") pod \"4b07c673-6251-43f9-9720-fcb0e598d9b6\" (UID: \"4b07c673-6251-43f9-9720-fcb0e598d9b6\") " Apr 17 21:22:38.087408 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:22:38.087230 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-bundle\") pod \"254ffd77-476c-48c4-88ff-2f89b0847e9c\" (UID: \"254ffd77-476c-48c4-88ff-2f89b0847e9c\") " Apr 17 21:22:38.087902 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.087873 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-bundle" (OuterVolumeSpecName: "bundle") pod "254ffd77-476c-48c4-88ff-2f89b0847e9c" (UID: "254ffd77-476c-48c4-88ff-2f89b0847e9c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:38.088169 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.088133 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-bundle" (OuterVolumeSpecName: "bundle") pod "4b07c673-6251-43f9-9720-fcb0e598d9b6" (UID: "4b07c673-6251-43f9-9720-fcb0e598d9b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:38.089750 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.089727 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254ffd77-476c-48c4-88ff-2f89b0847e9c-kube-api-access-sqt5q" (OuterVolumeSpecName: "kube-api-access-sqt5q") pod "254ffd77-476c-48c4-88ff-2f89b0847e9c" (UID: "254ffd77-476c-48c4-88ff-2f89b0847e9c"). InnerVolumeSpecName "kube-api-access-sqt5q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:22:38.089829 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.089761 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b07c673-6251-43f9-9720-fcb0e598d9b6-kube-api-access-sxs7b" (OuterVolumeSpecName: "kube-api-access-sxs7b") pod "4b07c673-6251-43f9-9720-fcb0e598d9b6" (UID: "4b07c673-6251-43f9-9720-fcb0e598d9b6"). InnerVolumeSpecName "kube-api-access-sxs7b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:22:38.092225 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.092205 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-util" (OuterVolumeSpecName: "util") pod "4b07c673-6251-43f9-9720-fcb0e598d9b6" (UID: "4b07c673-6251-43f9-9720-fcb0e598d9b6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:38.092928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.092906 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-util" (OuterVolumeSpecName: "util") pod "254ffd77-476c-48c4-88ff-2f89b0847e9c" (UID: "254ffd77-476c-48c4-88ff-2f89b0847e9c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:38.188550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.188433 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:38.188550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.188465 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqt5q\" (UniqueName: \"kubernetes.io/projected/254ffd77-476c-48c4-88ff-2f89b0847e9c-kube-api-access-sqt5q\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:38.188550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.188475 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sxs7b\" (UniqueName: \"kubernetes.io/projected/4b07c673-6251-43f9-9720-fcb0e598d9b6-kube-api-access-sxs7b\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:38.188550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.188485 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:38.188550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.188495 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b07c673-6251-43f9-9720-fcb0e598d9b6-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:38.188550 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.188505 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/254ffd77-476c-48c4-88ff-2f89b0847e9c-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:38.894442 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.894345 2567 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" event={"ID":"4b07c673-6251-43f9-9720-fcb0e598d9b6","Type":"ContainerDied","Data":"05f29ca2893fa30cbba17b455acfc0304178923c0058a454c29dda26af378b24"} Apr 17 21:22:38.894442 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.894389 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05f29ca2893fa30cbba17b455acfc0304178923c0058a454c29dda26af378b24" Apr 17 21:22:38.894442 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.894364 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v" Apr 17 21:22:38.896113 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.896091 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" event={"ID":"254ffd77-476c-48c4-88ff-2f89b0847e9c","Type":"ContainerDied","Data":"edef390ff4a7412789e293046bae972c3c44a497743fec8182b5712cad7c3434"} Apr 17 21:22:38.896221 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.896117 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edef390ff4a7412789e293046bae972c3c44a497743fec8182b5712cad7c3434" Apr 17 21:22:38.896221 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.896122 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6" Apr 17 21:22:38.898189 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.898165 2567 generic.go:358] "Generic (PLEG): container finished" podID="1f396d01-3669-4f17-bdd9-3cd350725333" containerID="8439a80f8666dbe770ae83aeae0b55d6ae096d42c085e6923d1d7b8c88bf839a" exitCode=0 Apr 17 21:22:38.898307 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:38.898237 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" event={"ID":"1f396d01-3669-4f17-bdd9-3cd350725333","Type":"ContainerDied","Data":"8439a80f8666dbe770ae83aeae0b55d6ae096d42c085e6923d1d7b8c88bf839a"} Apr 17 21:22:39.032037 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.032013 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:39.094223 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.094186 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-bundle\") pod \"c2358ee3-70da-40bd-a51a-ccce72501936\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " Apr 17 21:22:39.094404 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.094285 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-util\") pod \"c2358ee3-70da-40bd-a51a-ccce72501936\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " Apr 17 21:22:39.094404 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.094310 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d5nz\" (UniqueName: 
\"kubernetes.io/projected/c2358ee3-70da-40bd-a51a-ccce72501936-kube-api-access-2d5nz\") pod \"c2358ee3-70da-40bd-a51a-ccce72501936\" (UID: \"c2358ee3-70da-40bd-a51a-ccce72501936\") " Apr 17 21:22:39.094800 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.094765 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-bundle" (OuterVolumeSpecName: "bundle") pod "c2358ee3-70da-40bd-a51a-ccce72501936" (UID: "c2358ee3-70da-40bd-a51a-ccce72501936"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:39.096471 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.096445 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2358ee3-70da-40bd-a51a-ccce72501936-kube-api-access-2d5nz" (OuterVolumeSpecName: "kube-api-access-2d5nz") pod "c2358ee3-70da-40bd-a51a-ccce72501936" (UID: "c2358ee3-70da-40bd-a51a-ccce72501936"). InnerVolumeSpecName "kube-api-access-2d5nz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:22:39.099711 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.099685 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-util" (OuterVolumeSpecName: "util") pod "c2358ee3-70da-40bd-a51a-ccce72501936" (UID: "c2358ee3-70da-40bd-a51a-ccce72501936"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:39.195616 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.195499 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:39.195616 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.195558 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2358ee3-70da-40bd-a51a-ccce72501936-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:39.195616 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.195570 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2d5nz\" (UniqueName: \"kubernetes.io/projected/c2358ee3-70da-40bd-a51a-ccce72501936-kube-api-access-2d5nz\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:39.904251 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.904220 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" Apr 17 21:22:39.904251 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.904242 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq" event={"ID":"c2358ee3-70da-40bd-a51a-ccce72501936","Type":"ContainerDied","Data":"9d533c3fe59540b2c7f98ea65acdf569e56813ff3f2be61623046da444f55661"} Apr 17 21:22:39.904786 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:39.904275 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d533c3fe59540b2c7f98ea65acdf569e56813ff3f2be61623046da444f55661" Apr 17 21:22:40.029790 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.029763 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:40.103631 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.103595 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-util\") pod \"1f396d01-3669-4f17-bdd9-3cd350725333\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " Apr 17 21:22:40.103798 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.103647 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqr5k\" (UniqueName: \"kubernetes.io/projected/1f396d01-3669-4f17-bdd9-3cd350725333-kube-api-access-rqr5k\") pod \"1f396d01-3669-4f17-bdd9-3cd350725333\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " Apr 17 21:22:40.103798 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.103709 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-bundle\") pod \"1f396d01-3669-4f17-bdd9-3cd350725333\" (UID: \"1f396d01-3669-4f17-bdd9-3cd350725333\") " Apr 17 21:22:40.104335 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.104299 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-bundle" (OuterVolumeSpecName: "bundle") pod "1f396d01-3669-4f17-bdd9-3cd350725333" (UID: "1f396d01-3669-4f17-bdd9-3cd350725333"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:40.105796 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.105772 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f396d01-3669-4f17-bdd9-3cd350725333-kube-api-access-rqr5k" (OuterVolumeSpecName: "kube-api-access-rqr5k") pod "1f396d01-3669-4f17-bdd9-3cd350725333" (UID: "1f396d01-3669-4f17-bdd9-3cd350725333"). InnerVolumeSpecName "kube-api-access-rqr5k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:22:40.109666 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.109627 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-util" (OuterVolumeSpecName: "util") pod "1f396d01-3669-4f17-bdd9-3cd350725333" (UID: "1f396d01-3669-4f17-bdd9-3cd350725333"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:22:40.205018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.204910 2567 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-util\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:40.205018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.204962 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqr5k\" (UniqueName: \"kubernetes.io/projected/1f396d01-3669-4f17-bdd9-3cd350725333-kube-api-access-rqr5k\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:40.205018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.204973 2567 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f396d01-3669-4f17-bdd9-3cd350725333-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:22:40.909751 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.909715 2567 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" Apr 17 21:22:40.909751 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.909722 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8" event={"ID":"1f396d01-3669-4f17-bdd9-3cd350725333","Type":"ContainerDied","Data":"f1d34b659227aa397bf3466464f417c2e66e3a59b625a1d73505c65a4b4e524b"} Apr 17 21:22:40.909751 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:40.909756 2567 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d34b659227aa397bf3466464f417c2e66e3a59b625a1d73505c65a4b4e524b" Apr 17 21:22:51.899877 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.899843 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b9fdff79-95ddb"] Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900191 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="util" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900202 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="util" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900211 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="util" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900216 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="util" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900224 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" 
containerName="extract" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900233 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" containerName="extract" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900243 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900248 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900257 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900262 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900269 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="extract" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900273 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="extract" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900316 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" containerName="util" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900322 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" containerName="util" Apr 17 21:22:51.900344 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:22:51.900327 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900333 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900342 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900349 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="pull" Apr 17 21:22:51.900344 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900355 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900361 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900367 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900373 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900381 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="util" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900385 2567 
state_mem.go:107] "Deleted CPUSet assignment" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="util" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900484 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b07c673-6251-43f9-9720-fcb0e598d9b6" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900492 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2358ee3-70da-40bd-a51a-ccce72501936" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900502 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="254ffd77-476c-48c4-88ff-2f89b0847e9c" containerName="extract" Apr 17 21:22:51.901200 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.900508 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f396d01-3669-4f17-bdd9-3cd350725333" containerName="extract" Apr 17 21:22:51.907329 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.907305 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:51.911862 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:51.911836 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b9fdff79-95ddb"] Apr 17 21:22:52.014780 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.014737 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mdh\" (UniqueName: \"kubernetes.io/projected/85c07886-1c15-4979-8e03-c7c45b836fa5-kube-api-access-v9mdh\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.014780 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.014786 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-service-ca\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.015015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.014902 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85c07886-1c15-4979-8e03-c7c45b836fa5-console-oauth-config\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.015015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.014942 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-console-config\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.015015 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.014987 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-oauth-serving-cert\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.015015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.015008 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85c07886-1c15-4979-8e03-c7c45b836fa5-console-serving-cert\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.015162 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.015024 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-trusted-ca-bundle\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116451 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116414 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mdh\" (UniqueName: \"kubernetes.io/projected/85c07886-1c15-4979-8e03-c7c45b836fa5-kube-api-access-v9mdh\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116451 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116452 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-service-ca\") pod \"console-6b9fdff79-95ddb\" 
(UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116742 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116503 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85c07886-1c15-4979-8e03-c7c45b836fa5-console-oauth-config\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116742 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116552 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-console-config\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116742 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116597 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-oauth-serving-cert\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116742 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116617 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85c07886-1c15-4979-8e03-c7c45b836fa5-console-serving-cert\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.116742 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.116634 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-trusted-ca-bundle\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.117285 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.117257 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-service-ca\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.117389 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.117372 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-oauth-serving-cert\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.117487 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.117472 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-trusted-ca-bundle\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.117552 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.117481 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85c07886-1c15-4979-8e03-c7c45b836fa5-console-config\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.119064 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.119044 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/85c07886-1c15-4979-8e03-c7c45b836fa5-console-serving-cert\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.119153 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.119071 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85c07886-1c15-4979-8e03-c7c45b836fa5-console-oauth-config\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.125266 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.125244 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mdh\" (UniqueName: \"kubernetes.io/projected/85c07886-1c15-4979-8e03-c7c45b836fa5-kube-api-access-v9mdh\") pod \"console-6b9fdff79-95ddb\" (UID: \"85c07886-1c15-4979-8e03-c7c45b836fa5\") " pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.219100 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.219006 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:22:52.355085 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.355053 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b9fdff79-95ddb"] Apr 17 21:22:52.356121 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:52.356097 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c07886_1c15_4979_8e03_c7c45b836fa5.slice/crio-d1c17aa97559f6c92627d09db25c549774d3e008e05a7fca4d0f3cbc14e4b925 WatchSource:0}: Error finding container d1c17aa97559f6c92627d09db25c549774d3e008e05a7fca4d0f3cbc14e4b925: Status 404 returned error can't find the container with id d1c17aa97559f6c92627d09db25c549774d3e008e05a7fca4d0f3cbc14e4b925 Apr 17 21:22:52.964758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.964718 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b9fdff79-95ddb" event={"ID":"85c07886-1c15-4979-8e03-c7c45b836fa5","Type":"ContainerStarted","Data":"1bc8355080c52aaa8e8b157cd9a617d5c7b2f3f26f9bfd42af00561c6e82a23d"} Apr 17 21:22:52.964758 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.964764 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b9fdff79-95ddb" event={"ID":"85c07886-1c15-4979-8e03-c7c45b836fa5","Type":"ContainerStarted","Data":"d1c17aa97559f6c92627d09db25c549774d3e008e05a7fca4d0f3cbc14e4b925"} Apr 17 21:22:52.982693 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:52.982644 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b9fdff79-95ddb" podStartSLOduration=1.9826286290000001 podStartE2EDuration="1.982628629s" podCreationTimestamp="2026-04-17 21:22:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:22:52.981029686 +0000 UTC 
m=+528.550542440" watchObservedRunningTime="2026-04-17 21:22:52.982628629 +0000 UTC m=+528.552141382" Apr 17 21:22:55.515306 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.515218 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t"] Apr 17 21:22:55.518730 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.518706 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.521266 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.521243 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-589nh\"" Apr 17 21:22:55.529125 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.529096 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t"] Apr 17 21:22:55.650950 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.650899 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f957663-f73a-4111-98c3-58aca8336471-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.651128 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.650959 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc7nb\" (UniqueName: \"kubernetes.io/projected/9f957663-f73a-4111-98c3-58aca8336471-kube-api-access-rc7nb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" 
Apr 17 21:22:55.752252 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.752202 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f957663-f73a-4111-98c3-58aca8336471-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.752252 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.752257 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rc7nb\" (UniqueName: \"kubernetes.io/projected/9f957663-f73a-4111-98c3-58aca8336471-kube-api-access-rc7nb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.752648 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.752628 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f957663-f73a-4111-98c3-58aca8336471-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.761904 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.761882 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc7nb\" (UniqueName: \"kubernetes.io/projected/9f957663-f73a-4111-98c3-58aca8336471-kube-api-access-rc7nb\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.830301 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.830209 2567 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:22:55.965245 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.965212 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t"] Apr 17 21:22:55.966985 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:22:55.966952 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f957663_f73a_4111_98c3_58aca8336471.slice/crio-03854985c076875e59c2e02975df84c493fd0e2a6d13fefc43d2cb184d327383 WatchSource:0}: Error finding container 03854985c076875e59c2e02975df84c493fd0e2a6d13fefc43d2cb184d327383: Status 404 returned error can't find the container with id 03854985c076875e59c2e02975df84c493fd0e2a6d13fefc43d2cb184d327383 Apr 17 21:22:55.976651 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:22:55.976610 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" event={"ID":"9f957663-f73a-4111-98c3-58aca8336471","Type":"ContainerStarted","Data":"03854985c076875e59c2e02975df84c493fd0e2a6d13fefc43d2cb184d327383"} Apr 17 21:23:02.003107 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:02.003061 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" event={"ID":"9f957663-f73a-4111-98c3-58aca8336471","Type":"ContainerStarted","Data":"7cda7a3fbc2574129513c810584fc870950a1e730fb5b231fa1ef09154e465b7"} Apr 17 21:23:02.003777 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:02.003168 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:23:02.025505 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:02.025449 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" podStartSLOduration=1.933018341 podStartE2EDuration="7.025433418s" podCreationTimestamp="2026-04-17 21:22:55 +0000 UTC" firstStartedPulling="2026-04-17 21:22:55.969429977 +0000 UTC m=+531.538942708" lastFinishedPulling="2026-04-17 21:23:01.061845049 +0000 UTC m=+536.631357785" observedRunningTime="2026-04-17 21:23:02.023207263 +0000 UTC m=+537.592720043" watchObservedRunningTime="2026-04-17 21:23:02.025433418 +0000 UTC m=+537.594946172" Apr 17 21:23:02.220107 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:02.220064 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:23:02.220264 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:02.220120 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:23:02.225126 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:02.225097 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:23:03.012190 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.012163 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b9fdff79-95ddb" Apr 17 21:23:03.063692 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.063658 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6df6579f-w4rxx"] Apr 17 21:23:03.968260 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.968228 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6"] Apr 17 21:23:03.972046 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.972024 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:03.974388 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.974366 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 21:23:03.974388 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.974383 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 21:23:03.974599 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.974384 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cdvnh\"" Apr 17 21:23:03.979209 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:03.979185 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6"] Apr 17 21:23:04.040268 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.040230 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/acc732b9-d67a-449d-9ad2-4013a3acc44f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.040740 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.040275 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/acc732b9-d67a-449d-9ad2-4013a3acc44f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.040740 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.040507 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9rq5h\" (UniqueName: \"kubernetes.io/projected/acc732b9-d67a-449d-9ad2-4013a3acc44f-kube-api-access-9rq5h\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.141842 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.141808 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/acc732b9-d67a-449d-9ad2-4013a3acc44f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.141842 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.141845 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/acc732b9-d67a-449d-9ad2-4013a3acc44f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.142073 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.141893 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rq5h\" (UniqueName: \"kubernetes.io/projected/acc732b9-d67a-449d-9ad2-4013a3acc44f-kube-api-access-9rq5h\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.142481 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.142460 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/acc732b9-d67a-449d-9ad2-4013a3acc44f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.144260 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.144241 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/acc732b9-d67a-449d-9ad2-4013a3acc44f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.150309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.150284 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rq5h\" (UniqueName: \"kubernetes.io/projected/acc732b9-d67a-449d-9ad2-4013a3acc44f-kube-api-access-9rq5h\") pod \"kuadrant-console-plugin-6cb54b5c86-xk9z6\" (UID: \"acc732b9-d67a-449d-9ad2-4013a3acc44f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.282907 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.282816 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" Apr 17 21:23:04.431660 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:04.431629 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6"] Apr 17 21:23:04.433037 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:23:04.433008 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacc732b9_d67a_449d_9ad2_4013a3acc44f.slice/crio-1f88efd46d8ff628059832f26cdfd6a43533dda57ae7a61218e385462b0de5fc WatchSource:0}: Error finding container 1f88efd46d8ff628059832f26cdfd6a43533dda57ae7a61218e385462b0de5fc: Status 404 returned error can't find the container with id 1f88efd46d8ff628059832f26cdfd6a43533dda57ae7a61218e385462b0de5fc Apr 17 21:23:05.016545 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:05.016486 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" event={"ID":"acc732b9-d67a-449d-9ad2-4013a3acc44f","Type":"ContainerStarted","Data":"1f88efd46d8ff628059832f26cdfd6a43533dda57ae7a61218e385462b0de5fc"} Apr 17 21:23:13.010759 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:13.010723 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:23:14.460634 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.460598 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc"] Apr 17 21:23:14.478957 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.478926 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc"] Apr 17 21:23:14.479129 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.479067 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.558787 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.558748 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/15710382-536a-4533-809b-c9461afdd663-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-wthdc\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.558979 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.558888 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwkf4\" (UniqueName: \"kubernetes.io/projected/15710382-536a-4533-809b-c9461afdd663-kube-api-access-lwkf4\") pod \"kuadrant-operator-controller-manager-84b657d985-wthdc\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.575858 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.575813 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc"] Apr 17 21:23:14.576783 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:23:14.576745 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-lwkf4], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" podUID="15710382-536a-4533-809b-c9461afdd663" Apr 17 21:23:14.616965 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.616926 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c"] Apr 17 21:23:14.624723 ip-10-0-134-198 kubenswrapper[2567]: I0417 
21:23:14.624687 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" Apr 17 21:23:14.633534 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.633488 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c"] Apr 17 21:23:14.645481 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.645449 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c"] Apr 17 21:23:14.645788 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:23:14.645765 2567 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-9md2g], unattached volumes=[], failed to process volumes=[extensions-socket-volume kube-api-access-9md2g]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" Apr 17 21:23:14.655065 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.655040 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c"] Apr 17 21:23:14.660261 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.660227 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/15710382-536a-4533-809b-c9461afdd663-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-wthdc\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.660410 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.660392 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwkf4\" (UniqueName: 
\"kubernetes.io/projected/15710382-536a-4533-809b-c9461afdd663-kube-api-access-lwkf4\") pod \"kuadrant-operator-controller-manager-84b657d985-wthdc\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.660729 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.660704 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/15710382-536a-4533-809b-c9461afdd663-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-wthdc\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.666864 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.666838 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t"] Apr 17 21:23:14.667104 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.667077 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" podUID="9f957663-f73a-4111-98c3-58aca8336471" containerName="manager" containerID="cri-o://7cda7a3fbc2574129513c810584fc870950a1e730fb5b231fa1ef09154e465b7" gracePeriod=2 Apr 17 21:23:14.671539 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.671459 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k"] Apr 17 21:23:14.675883 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.675863 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:14.686079 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.685641 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:14.687599 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.687579 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t"] Apr 17 21:23:14.691373 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.691349 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k"] Apr 17 21:23:14.692043 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.692020 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwkf4\" (UniqueName: \"kubernetes.io/projected/15710382-536a-4533-809b-c9461afdd663-kube-api-access-lwkf4\") pod \"kuadrant-operator-controller-manager-84b657d985-wthdc\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:14.698313 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.698290 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9"] Apr 17 21:23:14.698838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.698815 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f957663-f73a-4111-98c3-58aca8336471" 
containerName="manager" Apr 17 21:23:14.698838 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.698838 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f957663-f73a-4111-98c3-58aca8336471" containerName="manager" Apr 17 21:23:14.699004 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.698943 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f957663-f73a-4111-98c3-58aca8336471" containerName="manager" Apr 17 21:23:14.702236 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.702219 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.707124 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.707103 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc"] Apr 17 21:23:14.717396 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.717318 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9"] Apr 17 21:23:14.721242 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.721213 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc"] Apr 17 21:23:14.745639 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.745599 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:14.761845 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.761815 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb6df58b-633b-471b-a227-0bada50e5121-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z47n9\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.761998 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.761861 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxs5l\" (UniqueName: \"kubernetes.io/projected/bb6df58b-633b-471b-a227-0bada50e5121-kube-api-access-pxs5l\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z47n9\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.761998 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.761881 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/29037a2d-15a2-4a4c-a508-f6d7f105e4ab-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tpr4k\" (UID: \"29037a2d-15a2-4a4c-a508-f6d7f105e4ab\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:14.761998 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.761916 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbrv5\" (UniqueName: \"kubernetes.io/projected/29037a2d-15a2-4a4c-a508-f6d7f105e4ab-kube-api-access-jbrv5\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tpr4k\" (UID: \"29037a2d-15a2-4a4c-a508-f6d7f105e4ab\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:14.862803 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.862765 2567 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb6df58b-633b-471b-a227-0bada50e5121-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z47n9\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.863015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.862825 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxs5l\" (UniqueName: \"kubernetes.io/projected/bb6df58b-633b-471b-a227-0bada50e5121-kube-api-access-pxs5l\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z47n9\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.863015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.862853 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/29037a2d-15a2-4a4c-a508-f6d7f105e4ab-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tpr4k\" (UID: \"29037a2d-15a2-4a4c-a508-f6d7f105e4ab\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:14.863015 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.862874 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbrv5\" (UniqueName: \"kubernetes.io/projected/29037a2d-15a2-4a4c-a508-f6d7f105e4ab-kube-api-access-jbrv5\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tpr4k\" (UID: \"29037a2d-15a2-4a4c-a508-f6d7f105e4ab\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:14.863266 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.863239 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/bb6df58b-633b-471b-a227-0bada50e5121-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z47n9\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.863354 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.863324 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/29037a2d-15a2-4a4c-a508-f6d7f105e4ab-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tpr4k\" (UID: \"29037a2d-15a2-4a4c-a508-f6d7f105e4ab\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:14.872418 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.872376 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxs5l\" (UniqueName: \"kubernetes.io/projected/bb6df58b-633b-471b-a227-0bada50e5121-kube-api-access-pxs5l\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-z47n9\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:14.873671 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:14.873639 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbrv5\" (UniqueName: \"kubernetes.io/projected/29037a2d-15a2-4a4c-a508-f6d7f105e4ab-kube-api-access-jbrv5\") pod \"kuadrant-operator-controller-manager-55c7f4c975-tpr4k\" (UID: \"29037a2d-15a2-4a4c-a508-f6d7f105e4ab\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:15.048221 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.048123 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:15.057108 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.057071 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" path="/var/lib/kubelet/pods/572c3a94-13ee-4532-ab06-16f1d4c7e5ea/volumes" Apr 17 21:23:15.057370 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.057345 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:15.058453 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.058419 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:15.066859 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.066835 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:15.067001 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.066835 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" Apr 17 21:23:15.074026 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.074001 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:15.077341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.077308 2567 status_manager.go:895] "Failed to get status for pod" podUID="15710382-536a-4533-809b-c9461afdd663" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" err="pods \"kuadrant-operator-controller-manager-84b657d985-wthdc\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:15.078292 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.078277 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" Apr 17 21:23:15.080590 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.080564 2567 status_manager.go:895] "Failed to get status for pod" podUID="15710382-536a-4533-809b-c9461afdd663" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" err="pods \"kuadrant-operator-controller-manager-84b657d985-wthdc\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:15.082485 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.082453 2567 status_manager.go:895] "Failed to get status for pod" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-ngl8c\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 
'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:15.164749 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.164716 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/15710382-536a-4533-809b-c9461afdd663-extensions-socket-volume\") pod \"15710382-536a-4533-809b-c9461afdd663\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " Apr 17 21:23:15.164938 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.164822 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwkf4\" (UniqueName: \"kubernetes.io/projected/15710382-536a-4533-809b-c9461afdd663-kube-api-access-lwkf4\") pod \"15710382-536a-4533-809b-c9461afdd663\" (UID: \"15710382-536a-4533-809b-c9461afdd663\") " Apr 17 21:23:15.165055 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.165027 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15710382-536a-4533-809b-c9461afdd663-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "15710382-536a-4533-809b-c9461afdd663" (UID: "15710382-536a-4533-809b-c9461afdd663"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:23:15.167095 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.167068 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15710382-536a-4533-809b-c9461afdd663-kube-api-access-lwkf4" (OuterVolumeSpecName: "kube-api-access-lwkf4") pod "15710382-536a-4533-809b-c9461afdd663" (UID: "15710382-536a-4533-809b-c9461afdd663"). InnerVolumeSpecName "kube-api-access-lwkf4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:23:15.265721 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.265679 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwkf4\" (UniqueName: \"kubernetes.io/projected/15710382-536a-4533-809b-c9461afdd663-kube-api-access-lwkf4\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:15.265721 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:15.265718 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/15710382-536a-4533-809b-c9461afdd663-extensions-socket-volume\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:16.071528 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.071486 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" Apr 17 21:23:16.071958 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.071481 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" Apr 17 21:23:16.074013 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.073982 2567 status_manager.go:895] "Failed to get status for pod" podUID="15710382-536a-4533-809b-c9461afdd663" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" err="pods \"kuadrant-operator-controller-manager-84b657d985-wthdc\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.075836 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.075813 2567 status_manager.go:895] "Failed to get status for pod" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-ngl8c\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.077619 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.077590 2567 status_manager.go:895] "Failed to get status for pod" podUID="15710382-536a-4533-809b-c9461afdd663" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" err="pods \"kuadrant-operator-controller-manager-84b657d985-wthdc\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.079427 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.079403 2567 status_manager.go:895] "Failed to get status for pod" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-ngl8c\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.082658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.082634 2567 status_manager.go:895] "Failed to get status for pod" podUID="15710382-536a-4533-809b-c9461afdd663" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" err="pods \"kuadrant-operator-controller-manager-84b657d985-wthdc\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.084498 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.084469 2567 status_manager.go:895] "Failed to get status for pod" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-ngl8c\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.086333 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.086310 2567 status_manager.go:895] "Failed to get status for pod" podUID="15710382-536a-4533-809b-c9461afdd663" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-wthdc" err="pods \"kuadrant-operator-controller-manager-84b657d985-wthdc\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship 
found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:16.088075 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:16.088046 2567 status_manager.go:895] "Failed to get status for pod" podUID="572c3a94-13ee-4532-ab06-16f1d4c7e5ea" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-ngl8c" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-ngl8c\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:17.056066 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:17.056028 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15710382-536a-4533-809b-c9461afdd663" path="/var/lib/kubelet/pods/15710382-536a-4533-809b-c9461afdd663/volumes" Apr 17 21:23:27.123801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.123690 2567 generic.go:358] "Generic (PLEG): container finished" podID="9f957663-f73a-4111-98c3-58aca8336471" containerID="7cda7a3fbc2574129513c810584fc870950a1e730fb5b231fa1ef09154e465b7" exitCode=0 Apr 17 21:23:27.161055 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.161031 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:23:27.163606 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.163571 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:27.192218 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.189991 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9"] Apr 17 21:23:27.192611 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:23:27.192557 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb6df58b_633b_471b_a227_0bada50e5121.slice/crio-4742d2ffe98cca51edc51e1eb78e70453a8b6e95b5c83e60365f4865464ec8ce WatchSource:0}: Error finding container 4742d2ffe98cca51edc51e1eb78e70453a8b6e95b5c83e60365f4865464ec8ce: Status 404 returned error can't find the container with id 4742d2ffe98cca51edc51e1eb78e70453a8b6e95b5c83e60365f4865464ec8ce Apr 17 21:23:27.290470 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.290385 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc7nb\" (UniqueName: \"kubernetes.io/projected/9f957663-f73a-4111-98c3-58aca8336471-kube-api-access-rc7nb\") pod \"9f957663-f73a-4111-98c3-58aca8336471\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " Apr 17 21:23:27.290470 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.290444 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f957663-f73a-4111-98c3-58aca8336471-extensions-socket-volume\") pod \"9f957663-f73a-4111-98c3-58aca8336471\" (UID: \"9f957663-f73a-4111-98c3-58aca8336471\") " Apr 17 21:23:27.290817 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.290794 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f957663-f73a-4111-98c3-58aca8336471-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9f957663-f73a-4111-98c3-58aca8336471" (UID: "9f957663-f73a-4111-98c3-58aca8336471"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:23:27.292401 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.292379 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f957663-f73a-4111-98c3-58aca8336471-kube-api-access-rc7nb" (OuterVolumeSpecName: "kube-api-access-rc7nb") pod "9f957663-f73a-4111-98c3-58aca8336471" (UID: "9f957663-f73a-4111-98c3-58aca8336471"). InnerVolumeSpecName "kube-api-access-rc7nb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:23:27.392420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.392380 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rc7nb\" (UniqueName: \"kubernetes.io/projected/9f957663-f73a-4111-98c3-58aca8336471-kube-api-access-rc7nb\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:27.392420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.392424 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f957663-f73a-4111-98c3-58aca8336471-extensions-socket-volume\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:27.421587 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:27.421559 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k"] Apr 17 21:23:27.423423 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:23:27.423396 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29037a2d_15a2_4a4c_a508_f6d7f105e4ab.slice/crio-4d134d96836b1b3315b6714a3a9339fa48a0989db0f00cd83cc259ab904e9695 WatchSource:0}: Error finding container 4d134d96836b1b3315b6714a3a9339fa48a0989db0f00cd83cc259ab904e9695: Status 404 returned error can't find the container with id 4d134d96836b1b3315b6714a3a9339fa48a0989db0f00cd83cc259ab904e9695 Apr 17 21:23:28.087234 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.087195 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c6df6579f-w4rxx" podUID="e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" containerName="console" containerID="cri-o://09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995" gracePeriod=15 Apr 17 21:23:28.129089 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.129054 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" event={"ID":"29037a2d-15a2-4a4c-a508-f6d7f105e4ab","Type":"ContainerStarted","Data":"88d82beba2e3700cd980167875dbee6153d68649f4419d81bc300a0bfab76161"} Apr 17 21:23:28.129544 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.129095 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" event={"ID":"29037a2d-15a2-4a4c-a508-f6d7f105e4ab","Type":"ContainerStarted","Data":"4d134d96836b1b3315b6714a3a9339fa48a0989db0f00cd83cc259ab904e9695"} Apr 17 21:23:28.129544 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.129203 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:28.130581 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.130540 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" event={"ID":"bb6df58b-633b-471b-a227-0bada50e5121","Type":"ContainerStarted","Data":"cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f"} Apr 17 21:23:28.130581 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.130568 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" event={"ID":"bb6df58b-633b-471b-a227-0bada50e5121","Type":"ContainerStarted","Data":"4742d2ffe98cca51edc51e1eb78e70453a8b6e95b5c83e60365f4865464ec8ce"} Apr 17 21:23:28.130757 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.130654 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:28.131428 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.131400 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:28.131917 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.131892 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" event={"ID":"acc732b9-d67a-449d-9ad2-4013a3acc44f","Type":"ContainerStarted","Data":"d55c84a7e67c615e21a0c5021e562b53684eddc7bda3b756ebd90e4099ff209e"} Apr 17 21:23:28.133332 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.133312 2567 scope.go:117] "RemoveContainer" containerID="7cda7a3fbc2574129513c810584fc870950a1e730fb5b231fa1ef09154e465b7" Apr 17 21:23:28.133332 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.133323 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" Apr 17 21:23:28.149980 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.149931 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" podStartSLOduration=14.149913223 podStartE2EDuration="14.149913223s" podCreationTimestamp="2026-04-17 21:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:23:28.147368227 +0000 UTC m=+563.716880986" watchObservedRunningTime="2026-04-17 21:23:28.149913223 +0000 UTC m=+563.719425978" Apr 17 21:23:28.167985 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.167922 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" podStartSLOduration=14.167904085 podStartE2EDuration="14.167904085s" podCreationTimestamp="2026-04-17 21:23:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:23:28.165974358 +0000 UTC m=+563.735487151" watchObservedRunningTime="2026-04-17 21:23:28.167904085 +0000 UTC m=+563.737416839" Apr 17 21:23:28.168143 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.168089 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:28.189228 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.189179 2567 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-xk9z6" podStartSLOduration=2.517673375 podStartE2EDuration="25.189161789s" podCreationTimestamp="2026-04-17 21:23:03 +0000 UTC" firstStartedPulling="2026-04-17 21:23:04.434304024 +0000 UTC m=+540.003816760" lastFinishedPulling="2026-04-17 21:23:27.105792441 +0000 UTC m=+562.675305174" observedRunningTime="2026-04-17 21:23:28.188270479 +0000 UTC m=+563.757783255" watchObservedRunningTime="2026-04-17 21:23:28.189161789 +0000 UTC m=+563.758674545" Apr 17 21:23:28.329764 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.329737 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6df6579f-w4rxx_e0f84fcf-5e32-4376-9ad9-4f5391a53cbf/console/0.log" Apr 17 21:23:28.329885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.329799 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:23:28.350885 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.350842 2567 status_manager.go:895] "Failed to get status for pod" podUID="9f957663-f73a-4111-98c3-58aca8336471" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-wlr7t" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-wlr7t\" is forbidden: User \"system:node:ip-10-0-134-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-198.ec2.internal' and this object" Apr 17 21:23:28.504600 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504563 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gktln\" (UniqueName: \"kubernetes.io/projected/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-kube-api-access-gktln\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.504600 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504601 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-serving-cert\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.504833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504635 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-config\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.504833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504702 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-service-ca\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.504833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504752 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-oauth-config\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.504833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504777 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-oauth-serving-cert\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.504833 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.504802 2567 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-trusted-ca-bundle\") pod \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\" (UID: \"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf\") " Apr 17 21:23:28.505232 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.505201 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-config" (OuterVolumeSpecName: "console-config") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:23:28.505331 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.505266 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-service-ca" (OuterVolumeSpecName: "service-ca") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:23:28.505331 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.505290 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:23:28.505460 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.505357 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:23:28.506920 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.506887 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-kube-api-access-gktln" (OuterVolumeSpecName: "kube-api-access-gktln") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "kube-api-access-gktln". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:23:28.507479 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.507447 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:23:28.507615 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.507549 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" (UID: "e0f84fcf-5e32-4376-9ad9-4f5391a53cbf"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:23:28.605903 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605864 2567 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-oauth-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:28.605903 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605903 2567 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-oauth-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:28.606160 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605917 2567 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-trusted-ca-bundle\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:28.606160 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605933 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gktln\" (UniqueName: \"kubernetes.io/projected/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-kube-api-access-gktln\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:28.606160 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605959 2567 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-serving-cert\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:28.606160 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605975 2567 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-console-config\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:28.606160 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:28.605990 2567 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf-service-ca\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:29.055658 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.055568 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f957663-f73a-4111-98c3-58aca8336471" path="/var/lib/kubelet/pods/9f957663-f73a-4111-98c3-58aca8336471/volumes" Apr 17 21:23:29.138997 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.138972 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c6df6579f-w4rxx_e0f84fcf-5e32-4376-9ad9-4f5391a53cbf/console/0.log" Apr 17 21:23:29.139397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.139011 2567 generic.go:358] "Generic (PLEG): container finished" podID="e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" containerID="09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995" exitCode=2 Apr 17 21:23:29.139397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.139089 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c6df6579f-w4rxx" Apr 17 21:23:29.139397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.139120 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6df6579f-w4rxx" event={"ID":"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf","Type":"ContainerDied","Data":"09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995"} Apr 17 21:23:29.139397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.139161 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c6df6579f-w4rxx" event={"ID":"e0f84fcf-5e32-4376-9ad9-4f5391a53cbf","Type":"ContainerDied","Data":"8d684c36afeed793ba5d581641b030554d006ea1b8951435ccee2e94712d02b9"} Apr 17 21:23:29.139397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.139178 2567 scope.go:117] "RemoveContainer" containerID="09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995" Apr 17 21:23:29.148014 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.147995 2567 scope.go:117] "RemoveContainer" containerID="09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995" Apr 17 21:23:29.148268 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:23:29.148244 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995\": container with ID starting with 09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995 not found: ID does not exist" containerID="09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995" Apr 17 21:23:29.148371 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.148279 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995"} err="failed to get container status \"09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995\": rpc error: code = 
NotFound desc = could not find container \"09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995\": container with ID starting with 09a69b6ac4d33555f5481b5326b3b521b8ac963b8abd55feb4b4472d58526995 not found: ID does not exist" Apr 17 21:23:29.158014 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.157985 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c6df6579f-w4rxx"] Apr 17 21:23:29.161456 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:29.161431 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c6df6579f-w4rxx"] Apr 17 21:23:31.056134 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:31.056101 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" path="/var/lib/kubelet/pods/e0f84fcf-5e32-4376-9ad9-4f5391a53cbf/volumes" Apr 17 21:23:39.142002 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.141964 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-tpr4k" Apr 17 21:23:39.142534 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.142032 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:39.196135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.196103 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9"] Apr 17 21:23:39.196440 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.196389 2567 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" podUID="bb6df58b-633b-471b-a227-0bada50e5121" containerName="manager" containerID="cri-o://cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f" gracePeriod=10 Apr 17 21:23:39.486296 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.486271 2567 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:39.497090 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.497062 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb6df58b-633b-471b-a227-0bada50e5121-extensions-socket-volume\") pod \"bb6df58b-633b-471b-a227-0bada50e5121\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " Apr 17 21:23:39.497261 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.497127 2567 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxs5l\" (UniqueName: \"kubernetes.io/projected/bb6df58b-633b-471b-a227-0bada50e5121-kube-api-access-pxs5l\") pod \"bb6df58b-633b-471b-a227-0bada50e5121\" (UID: \"bb6df58b-633b-471b-a227-0bada50e5121\") " Apr 17 21:23:39.497497 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.497471 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb6df58b-633b-471b-a227-0bada50e5121-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "bb6df58b-633b-471b-a227-0bada50e5121" (UID: "bb6df58b-633b-471b-a227-0bada50e5121"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:23:39.499231 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.499208 2567 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6df58b-633b-471b-a227-0bada50e5121-kube-api-access-pxs5l" (OuterVolumeSpecName: "kube-api-access-pxs5l") pod "bb6df58b-633b-471b-a227-0bada50e5121" (UID: "bb6df58b-633b-471b-a227-0bada50e5121"). InnerVolumeSpecName "kube-api-access-pxs5l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:23:39.598556 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.598506 2567 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxs5l\" (UniqueName: \"kubernetes.io/projected/bb6df58b-633b-471b-a227-0bada50e5121-kube-api-access-pxs5l\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:39.598556 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:39.598553 2567 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bb6df58b-633b-471b-a227-0bada50e5121-extensions-socket-volume\") on node \"ip-10-0-134-198.ec2.internal\" DevicePath \"\"" Apr 17 21:23:40.185144 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.185106 2567 generic.go:358] "Generic (PLEG): container finished" podID="bb6df58b-633b-471b-a227-0bada50e5121" containerID="cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f" exitCode=0 Apr 17 21:23:40.185633 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.185160 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" event={"ID":"bb6df58b-633b-471b-a227-0bada50e5121","Type":"ContainerDied","Data":"cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f"} Apr 17 21:23:40.185633 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.185171 2567 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" Apr 17 21:23:40.185633 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.185194 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9" event={"ID":"bb6df58b-633b-471b-a227-0bada50e5121","Type":"ContainerDied","Data":"4742d2ffe98cca51edc51e1eb78e70453a8b6e95b5c83e60365f4865464ec8ce"} Apr 17 21:23:40.185633 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.185213 2567 scope.go:117] "RemoveContainer" containerID="cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f" Apr 17 21:23:40.195247 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.195226 2567 scope.go:117] "RemoveContainer" containerID="cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f" Apr 17 21:23:40.195544 ip-10-0-134-198 kubenswrapper[2567]: E0417 21:23:40.195503 2567 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f\": container with ID starting with cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f not found: ID does not exist" containerID="cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f" Apr 17 21:23:40.195614 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.195556 2567 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f"} err="failed to get container status \"cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f\": rpc error: code = NotFound desc = could not find container \"cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f\": container with ID starting with cf89da3730ad0328d37465213fe1a04fc1e916498faa6b7cc6bcaa5b113c4f4f not found: ID does not exist" Apr 17 21:23:40.208451 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:23:40.208418 2567 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9"] Apr 17 21:23:40.221469 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:40.221433 2567 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-z47n9"] Apr 17 21:23:41.056105 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:41.056073 2567 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6df58b-633b-471b-a227-0bada50e5121" path="/var/lib/kubelet/pods/bb6df58b-633b-471b-a227-0bada50e5121/volumes" Apr 17 21:23:55.379509 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.379473 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9"] Apr 17 21:23:55.379982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.379942 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" containerName="console" Apr 17 21:23:55.379982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.379956 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" containerName="console" Apr 17 21:23:55.379982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.379966 2567 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb6df58b-633b-471b-a227-0bada50e5121" containerName="manager" Apr 17 21:23:55.379982 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.379972 2567 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6df58b-633b-471b-a227-0bada50e5121" containerName="manager" Apr 17 21:23:55.380124 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.380033 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0f84fcf-5e32-4376-9ad9-4f5391a53cbf" containerName="console" Apr 17 21:23:55.380124 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:23:55.380046 2567 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb6df58b-633b-471b-a227-0bada50e5121" containerName="manager" Apr 17 21:23:55.385374 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.385348 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.388387 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.388361 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-kq48x\"" Apr 17 21:23:55.393699 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.393666 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9"] Apr 17 21:23:55.442033 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.441997 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442325 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442056 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442325 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442124 2567 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8qd8\" (UniqueName: \"kubernetes.io/projected/03621f00-c1ff-4311-870e-48851bec8851-kube-api-access-c8qd8\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442325 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442215 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442325 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442270 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/03621f00-c1ff-4311-870e-48851bec8851-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442330 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442351 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/03621f00-c1ff-4311-870e-48851bec8851-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442448 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/03621f00-c1ff-4311-870e-48851bec8851-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.442665 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.442485 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543454 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543416 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543643 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543488 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/03621f00-c1ff-4311-870e-48851bec8851-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543643 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543554 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/03621f00-c1ff-4311-870e-48851bec8851-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543643 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543591 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543643 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543614 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543656 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-workload-certs\") pod 
\"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543685 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8qd8\" (UniqueName: \"kubernetes.io/projected/03621f00-c1ff-4311-870e-48851bec8851-kube-api-access-c8qd8\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543688 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543768 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.543834 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.543811 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/03621f00-c1ff-4311-870e-48851bec8851-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.544110 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.544025 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.544110 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.544042 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.544197 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.544109 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.544377 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.544341 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/03621f00-c1ff-4311-870e-48851bec8851-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.546023 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.545998 2567 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/03621f00-c1ff-4311-870e-48851bec8851-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.546265 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.546247 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/03621f00-c1ff-4311-870e-48851bec8851-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.551419 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.551396 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/03621f00-c1ff-4311-870e-48851bec8851-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.551609 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.551590 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8qd8\" (UniqueName: \"kubernetes.io/projected/03621f00-c1ff-4311-870e-48851bec8851-kube-api-access-c8qd8\") pod \"maas-default-gateway-openshift-default-58b6f876-2fdt9\" (UID: \"03621f00-c1ff-4311-870e-48851bec8851\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.699026 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.698932 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:55.826114 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.826081 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9"] Apr 17 21:23:55.827650 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:23:55.827620 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03621f00_c1ff_4311_870e_48851bec8851.slice/crio-4768ab2814abe51207fdc97a89739e67cbfedfcb07323fef81f97641266ea6e0 WatchSource:0}: Error finding container 4768ab2814abe51207fdc97a89739e67cbfedfcb07323fef81f97641266ea6e0: Status 404 returned error can't find the container with id 4768ab2814abe51207fdc97a89739e67cbfedfcb07323fef81f97641266ea6e0 Apr 17 21:23:55.829822 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.829793 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:23:55.829916 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.829852 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:23:55.829916 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:55.829880 2567 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:23:56.248877 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:56.248825 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" 
event={"ID":"03621f00-c1ff-4311-870e-48851bec8851","Type":"ContainerStarted","Data":"8ea5960c5096209467270ccf696249780088233b4a657df71d337e7e90bf3ac6"} Apr 17 21:23:56.248877 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:56.248873 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" event={"ID":"03621f00-c1ff-4311-870e-48851bec8851","Type":"ContainerStarted","Data":"4768ab2814abe51207fdc97a89739e67cbfedfcb07323fef81f97641266ea6e0"} Apr 17 21:23:56.267946 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:56.267897 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" podStartSLOduration=1.267882118 podStartE2EDuration="1.267882118s" podCreationTimestamp="2026-04-17 21:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:23:56.265104722 +0000 UTC m=+591.834617474" watchObservedRunningTime="2026-04-17 21:23:56.267882118 +0000 UTC m=+591.837394871" Apr 17 21:23:56.699188 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:56.699146 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:56.704443 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:56.704414 2567 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:57.252786 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:57.252753 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:57.253878 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:57.253855 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-2fdt9" Apr 17 21:23:59.438514 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.438480 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:23:59.441018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.441000 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.443262 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.443240 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 21:23:59.450487 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.450463 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:23:59.472828 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.472796 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:23:59.580392 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.580359 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/72e0799e-e01f-4ce4-a98a-f3cdd864a0bd-config-file\") pod \"limitador-limitador-78c99df468-pdljg\" (UID: \"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd\") " pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.580582 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.580405 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnw6\" (UniqueName: \"kubernetes.io/projected/72e0799e-e01f-4ce4-a98a-f3cdd864a0bd-kube-api-access-nfnw6\") pod \"limitador-limitador-78c99df468-pdljg\" (UID: \"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.681329 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.681294 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnw6\" (UniqueName: \"kubernetes.io/projected/72e0799e-e01f-4ce4-a98a-f3cdd864a0bd-kube-api-access-nfnw6\") pod \"limitador-limitador-78c99df468-pdljg\" (UID: \"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd\") " pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.681537 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.681400 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/72e0799e-e01f-4ce4-a98a-f3cdd864a0bd-config-file\") pod \"limitador-limitador-78c99df468-pdljg\" (UID: \"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd\") " pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.682059 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.682036 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/72e0799e-e01f-4ce4-a98a-f3cdd864a0bd-config-file\") pod \"limitador-limitador-78c99df468-pdljg\" (UID: \"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd\") " pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.689502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.689435 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnw6\" (UniqueName: \"kubernetes.io/projected/72e0799e-e01f-4ce4-a98a-f3cdd864a0bd-kube-api-access-nfnw6\") pod \"limitador-limitador-78c99df468-pdljg\" (UID: \"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd\") " pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.753538 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.753468 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:23:59.876662 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:23:59.876635 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:23:59.877731 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:23:59.877702 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e0799e_e01f_4ce4_a98a_f3cdd864a0bd.slice/crio-c7688be3abc335fe302ed480152c32a0bd450f5923068d132b6f40c0aafdee25 WatchSource:0}: Error finding container c7688be3abc335fe302ed480152c32a0bd450f5923068d132b6f40c0aafdee25: Status 404 returned error can't find the container with id c7688be3abc335fe302ed480152c32a0bd450f5923068d132b6f40c0aafdee25 Apr 17 21:24:00.266041 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:00.266007 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" event={"ID":"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd","Type":"ContainerStarted","Data":"c7688be3abc335fe302ed480152c32a0bd450f5923068d132b6f40c0aafdee25"} Apr 17 21:24:03.281007 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:03.280958 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" event={"ID":"72e0799e-e01f-4ce4-a98a-f3cdd864a0bd","Type":"ContainerStarted","Data":"e1f63a7dc9ecb97e266eb2b57b9dd23d2e65906de9054c918ee8cb4b3e28a726"} Apr 17 21:24:03.281007 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:03.281011 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:24:03.299339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:03.299289 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" podStartSLOduration=1.674373422 
podStartE2EDuration="4.299274888s" podCreationTimestamp="2026-04-17 21:23:59 +0000 UTC" firstStartedPulling="2026-04-17 21:23:59.87958566 +0000 UTC m=+595.449098392" lastFinishedPulling="2026-04-17 21:24:02.504487127 +0000 UTC m=+598.073999858" observedRunningTime="2026-04-17 21:24:03.297358081 +0000 UTC m=+598.866870846" watchObservedRunningTime="2026-04-17 21:24:03.299274888 +0000 UTC m=+598.868787641" Apr 17 21:24:04.975839 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:04.975804 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:24:04.976381 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:04.976358 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:24:04.979647 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:04.979622 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:24:04.980216 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:04.980189 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:24:14.286577 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:14.286514 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-pdljg" Apr 17 21:24:59.727770 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:24:59.727733 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:25:14.638660 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:14.638623 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:25:18.619286 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:18.619246 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:25:23.524384 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:23.524349 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:25:26.627990 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:26.627948 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:25:41.130851 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.130813 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"] Apr 17 21:25:41.464231 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.464141 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"] Apr 17 21:25:41.468735 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.468709 2567 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" Apr 17 21:25:41.472312 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.472288 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 21:25:41.472860 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.472834 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 17 21:25:41.473237 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.473220 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 21:25:41.473584 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.473561 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-vmdkh\"" Apr 17 21:25:41.476032 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.476008 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"] Apr 17 21:25:41.629912 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.629857 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" Apr 17 21:25:41.630136 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.630021 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" Apr 17 21:25:41.630136 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.630072 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.630136 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.630097 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9mh\" (UniqueName: \"kubernetes.io/projected/7c6ecdea-ad8d-4531-876c-519749106f1e-kube-api-access-kv9mh\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.630303 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.630277 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.630341 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.630322 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6ecdea-ad8d-4531-876c-519749106f1e-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731404 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731303 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731404 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731371 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6ecdea-ad8d-4531-876c-519749106f1e-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731662 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731417 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731662 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731453 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731662 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731496 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731662 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731555 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9mh\" (UniqueName: \"kubernetes.io/projected/7c6ecdea-ad8d-4531-876c-519749106f1e-kube-api-access-kv9mh\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731879 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731780 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731879 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731801 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.731961 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.731943 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.733832 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.733802 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7c6ecdea-ad8d-4531-876c-519749106f1e-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.734127 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.734107 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7c6ecdea-ad8d-4531-876c-519749106f1e-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.739181 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.739156 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9mh\" (UniqueName: \"kubernetes.io/projected/7c6ecdea-ad8d-4531-876c-519749106f1e-kube-api-access-kv9mh\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-rcwnx\" (UID: \"7c6ecdea-ad8d-4531-876c-519749106f1e\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.782226 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.782181 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:41.910062 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.910034 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"]
Apr 17 21:25:41.911740 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:25:41.911714 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6ecdea_ad8d_4531_876c_519749106f1e.slice/crio-2130c2d61e32f639107f53ea7a326362e426fb67eaa90c9c999519a24a568b0e WatchSource:0}: Error finding container 2130c2d61e32f639107f53ea7a326362e426fb67eaa90c9c999519a24a568b0e: Status 404 returned error can't find the container with id 2130c2d61e32f639107f53ea7a326362e426fb67eaa90c9c999519a24a568b0e
Apr 17 21:25:41.913813 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:41.913798 2567 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 21:25:42.694816 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:42.694779 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" event={"ID":"7c6ecdea-ad8d-4531-876c-519749106f1e","Type":"ContainerStarted","Data":"2130c2d61e32f639107f53ea7a326362e426fb67eaa90c9c999519a24a568b0e"}
Apr 17 21:25:46.326083 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:46.326042 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"]
Apr 17 21:25:47.719963 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:47.719930 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" event={"ID":"7c6ecdea-ad8d-4531-876c-519749106f1e","Type":"ContainerStarted","Data":"80d746830324364576a1846782fe60da47fc6ee7171989628a10f772022f3889"}
Apr 17 21:25:52.740602 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:52.740569 2567 generic.go:358] "Generic (PLEG): container finished" podID="7c6ecdea-ad8d-4531-876c-519749106f1e" containerID="80d746830324364576a1846782fe60da47fc6ee7171989628a10f772022f3889" exitCode=0
Apr 17 21:25:52.740981 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:52.740634 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" event={"ID":"7c6ecdea-ad8d-4531-876c-519749106f1e","Type":"ContainerDied","Data":"80d746830324364576a1846782fe60da47fc6ee7171989628a10f772022f3889"}
Apr 17 21:25:56.761512 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:56.761416 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" event={"ID":"7c6ecdea-ad8d-4531-876c-519749106f1e","Type":"ContainerStarted","Data":"3ebe1d736104c6c3d18e00d82c20c20740a4450d2c15be7e94fac6b8f8463db3"}
Apr 17 21:25:56.761944 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:56.761689 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:25:56.781916 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:25:56.781862 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx" podStartSLOduration=1.288474026 podStartE2EDuration="15.781848681s" podCreationTimestamp="2026-04-17 21:25:41 +0000 UTC" firstStartedPulling="2026-04-17 21:25:41.913920955 +0000 UTC m=+697.483433687" lastFinishedPulling="2026-04-17 21:25:56.407295611 +0000 UTC m=+711.976808342" observedRunningTime="2026-04-17 21:25:56.778064143 +0000 UTC m=+712.347576899" watchObservedRunningTime="2026-04-17 21:25:56.781848681 +0000 UTC m=+712.351361433"
Apr 17 21:26:07.778127 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:26:07.778090 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-rcwnx"
Apr 17 21:26:44.228941 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:26:44.228904 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pdljg"]
Apr 17 21:29:05.011585 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:29:05.011489 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log"
Apr 17 21:29:05.015128 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:29:05.015103 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log"
Apr 17 21:29:05.015307 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:29:05.015289 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log"
Apr 17 21:29:05.018352 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:29:05.018332 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log"
Apr 17 21:30:15.782397 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:15.782366 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-694fdf7c65-95kfc_463b17e4-2b3e-46d5-affc-05862505d3ba/manager/0.log"
Apr 17 21:30:16.912110 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:16.912075 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6_254ffd77-476c-48c4-88ff-2f89b0847e9c/util/0.log"
Apr 17 21:30:16.918249 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:16.918221 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6_254ffd77-476c-48c4-88ff-2f89b0847e9c/pull/0.log"
Apr 17 21:30:16.924009 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:16.923983 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6_254ffd77-476c-48c4-88ff-2f89b0847e9c/extract/0.log"
Apr 17 21:30:17.031917 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.031890 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v_4b07c673-6251-43f9-9720-fcb0e598d9b6/util/0.log"
Apr 17 21:30:17.037973 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.037947 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v_4b07c673-6251-43f9-9720-fcb0e598d9b6/pull/0.log"
Apr 17 21:30:17.043609 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.043585 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v_4b07c673-6251-43f9-9720-fcb0e598d9b6/extract/0.log"
Apr 17 21:30:17.173022 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.172922 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq_c2358ee3-70da-40bd-a51a-ccce72501936/util/0.log"
Apr 17 21:30:17.179085 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.179058 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq_c2358ee3-70da-40bd-a51a-ccce72501936/pull/0.log"
Apr 17 21:30:17.184805 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.184776 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq_c2358ee3-70da-40bd-a51a-ccce72501936/extract/0.log"
Apr 17 21:30:17.296832 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.296801 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8_1f396d01-3669-4f17-bdd9-3cd350725333/extract/0.log"
Apr 17 21:30:17.302394 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.302373 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8_1f396d01-3669-4f17-bdd9-3cd350725333/util/0.log"
Apr 17 21:30:17.308408 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.308392 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8_1f396d01-3669-4f17-bdd9-3cd350725333/pull/0.log"
Apr 17 21:30:17.767187 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.767161 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-xk9z6_acc732b9-d67a-449d-9ad2-4013a3acc44f/kuadrant-console-plugin/0.log"
Apr 17 21:30:17.882756 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:17.882706 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-hx8gd_0374bd95-cb75-4964-be0a-c883b7f390e7/registry-server/0.log"
Apr 17 21:30:18.017576 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:18.017456 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-tpr4k_29037a2d-15a2-4a4c-a508-f6d7f105e4ab/manager/0.log"
Apr 17 21:30:18.129801 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:18.129768 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pdljg_72e0799e-e01f-4ce4-a98a-f3cdd864a0bd/limitador/0.log"
Apr 17 21:30:18.588133 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:18.588098 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr_438ec947-206d-49a2-af2a-f80a0d472353/istio-proxy/0.log"
Apr 17 21:30:19.047130 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:19.047100 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-2fdt9_03621f00-c1ff-4311-870e-48851bec8851/istio-proxy/0.log"
Apr 17 21:30:19.809894 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:19.809831 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-rcwnx_7c6ecdea-ad8d-4531-876c-519749106f1e/storage-initializer/0.log"
Apr 17 21:30:19.816963 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:19.816909 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-rcwnx_7c6ecdea-ad8d-4531-876c-519749106f1e/main/0.log"
Apr 17 21:30:23.760682 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.760604 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w64ck/must-gather-cmjx5"]
Apr 17 21:30:23.764419 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.764376 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:23.768218 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.768126 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w64ck\"/\"kube-root-ca.crt\""
Apr 17 21:30:23.768399 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.768156 2567 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-w64ck\"/\"default-dockercfg-fwnkt\""
Apr 17 21:30:23.768475 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.768221 2567 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-w64ck\"/\"openshift-service-ca.crt\""
Apr 17 21:30:23.770370 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.770347 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/must-gather-cmjx5"]
Apr 17 21:30:23.874321 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.874271 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s4tr\" (UniqueName: \"kubernetes.io/projected/17c43bbb-9dca-4e3d-9ff4-f321abbd379a-kube-api-access-2s4tr\") pod \"must-gather-cmjx5\" (UID: \"17c43bbb-9dca-4e3d-9ff4-f321abbd379a\") " pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:23.874502 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.874412 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17c43bbb-9dca-4e3d-9ff4-f321abbd379a-must-gather-output\") pod \"must-gather-cmjx5\" (UID: \"17c43bbb-9dca-4e3d-9ff4-f321abbd379a\") " pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:23.975933 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.975889 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s4tr\" (UniqueName: \"kubernetes.io/projected/17c43bbb-9dca-4e3d-9ff4-f321abbd379a-kube-api-access-2s4tr\") pod \"must-gather-cmjx5\" (UID: \"17c43bbb-9dca-4e3d-9ff4-f321abbd379a\") " pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:23.976133 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.976007 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17c43bbb-9dca-4e3d-9ff4-f321abbd379a-must-gather-output\") pod \"must-gather-cmjx5\" (UID: \"17c43bbb-9dca-4e3d-9ff4-f321abbd379a\") " pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:23.976328 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.976305 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/17c43bbb-9dca-4e3d-9ff4-f321abbd379a-must-gather-output\") pod \"must-gather-cmjx5\" (UID: \"17c43bbb-9dca-4e3d-9ff4-f321abbd379a\") " pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:23.991288 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:23.991255 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s4tr\" (UniqueName: \"kubernetes.io/projected/17c43bbb-9dca-4e3d-9ff4-f321abbd379a-kube-api-access-2s4tr\") pod \"must-gather-cmjx5\" (UID: \"17c43bbb-9dca-4e3d-9ff4-f321abbd379a\") " pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:24.074601 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:24.074495 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/must-gather-cmjx5"
Apr 17 21:30:24.212096 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:24.212064 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/must-gather-cmjx5"]
Apr 17 21:30:24.214798 ip-10-0-134-198 kubenswrapper[2567]: W0417 21:30:24.214762 2567 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17c43bbb_9dca_4e3d_9ff4_f321abbd379a.slice/crio-85c8eac23d6923b253382b9f27bc60f33e3572f10b9ff99b44e06e6622c247fd WatchSource:0}: Error finding container 85c8eac23d6923b253382b9f27bc60f33e3572f10b9ff99b44e06e6622c247fd: Status 404 returned error can't find the container with id 85c8eac23d6923b253382b9f27bc60f33e3572f10b9ff99b44e06e6622c247fd
Apr 17 21:30:24.827198 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:24.827159 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/must-gather-cmjx5" event={"ID":"17c43bbb-9dca-4e3d-9ff4-f321abbd379a","Type":"ContainerStarted","Data":"85c8eac23d6923b253382b9f27bc60f33e3572f10b9ff99b44e06e6622c247fd"}
Apr 17 21:30:25.835913 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:25.835867 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/must-gather-cmjx5" event={"ID":"17c43bbb-9dca-4e3d-9ff4-f321abbd379a","Type":"ContainerStarted","Data":"65af9fe7060fb25a4ef9c3d7faad8da20b955b1bf3124501786519e65a213851"}
Apr 17 21:30:25.836386 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:25.835932 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/must-gather-cmjx5" event={"ID":"17c43bbb-9dca-4e3d-9ff4-f321abbd379a","Type":"ContainerStarted","Data":"23d3092774ca1db54c50c219295cc41862527b6127ab0bf3e5838ef718875bb7"}
Apr 17 21:30:25.853406 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:25.853331 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w64ck/must-gather-cmjx5" podStartSLOduration=2.033496841 podStartE2EDuration="2.853309938s" podCreationTimestamp="2026-04-17 21:30:23 +0000 UTC" firstStartedPulling="2026-04-17 21:30:24.216631045 +0000 UTC m=+979.786143777" lastFinishedPulling="2026-04-17 21:30:25.036444143 +0000 UTC m=+980.605956874" observedRunningTime="2026-04-17 21:30:25.851114586 +0000 UTC m=+981.420627340" watchObservedRunningTime="2026-04-17 21:30:25.853309938 +0000 UTC m=+981.422822693"
Apr 17 21:30:26.630763 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:26.630735 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dsmfz_b51bcd52-26f7-423d-a48a-7a9ea687c5be/global-pull-secret-syncer/0.log"
Apr 17 21:30:26.784710 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:26.784674 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-n4q7j_d08d6c80-4c67-4cea-8f12-12b55c526b6d/konnectivity-agent/0.log"
Apr 17 21:30:26.833114 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:26.833085 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-198.ec2.internal_fdcbae07efe3e8c40fcc3f75d7de6766/haproxy/0.log"
Apr 17 21:30:30.914850 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:30.914807 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6_254ffd77-476c-48c4-88ff-2f89b0847e9c/extract/0.log"
Apr 17 21:30:30.937057 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:30.937005 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6_254ffd77-476c-48c4-88ff-2f89b0847e9c/util/0.log"
Apr 17 21:30:30.960319 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:30.960289 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759spkn6_254ffd77-476c-48c4-88ff-2f89b0847e9c/pull/0.log"
Apr 17 21:30:30.986699 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:30.986641 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v_4b07c673-6251-43f9-9720-fcb0e598d9b6/extract/0.log"
Apr 17 21:30:31.012094 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.012008 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v_4b07c673-6251-43f9-9720-fcb0e598d9b6/util/0.log"
Apr 17 21:30:31.031120 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.031089 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0b9g9v_4b07c673-6251-43f9-9720-fcb0e598d9b6/pull/0.log"
Apr 17 21:30:31.062716 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.062681 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq_c2358ee3-70da-40bd-a51a-ccce72501936/extract/0.log"
Apr 17 21:30:31.087090 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.087065 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq_c2358ee3-70da-40bd-a51a-ccce72501936/util/0.log"
Apr 17 21:30:31.115375 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.115348 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73fdwgq_c2358ee3-70da-40bd-a51a-ccce72501936/pull/0.log"
Apr 17 21:30:31.148382 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.148350 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8_1f396d01-3669-4f17-bdd9-3cd350725333/extract/0.log"
Apr 17 21:30:31.173197 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.173104 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8_1f396d01-3669-4f17-bdd9-3cd350725333/util/0.log"
Apr 17 21:30:31.211946 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.211920 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18b7x8_1f396d01-3669-4f17-bdd9-3cd350725333/pull/0.log"
Apr 17 21:30:31.340206 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.340179 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-xk9z6_acc732b9-d67a-449d-9ad2-4013a3acc44f/kuadrant-console-plugin/0.log"
Apr 17 21:30:31.370069 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.370034 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-hx8gd_0374bd95-cb75-4964-be0a-c883b7f390e7/registry-server/0.log"
Apr 17 21:30:31.418568 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.418535 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-tpr4k_29037a2d-15a2-4a4c-a508-f6d7f105e4ab/manager/0.log"
Apr 17 21:30:31.443847 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:31.443753 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pdljg_72e0799e-e01f-4ce4-a98a-f3cdd864a0bd/limitador/0.log"
Apr 17 21:30:32.959640 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:32.959608 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/alertmanager/0.log"
Apr 17 21:30:32.987827 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:32.987798 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/config-reloader/0.log"
Apr 17 21:30:33.010128 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.010094 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/kube-rbac-proxy-web/0.log"
Apr 17 21:30:33.031999 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.031970 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/kube-rbac-proxy/0.log"
Apr 17 21:30:33.062137 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.062111 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/kube-rbac-proxy-metric/0.log"
Apr 17 21:30:33.093431 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.093401 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/prom-label-proxy/0.log"
Apr 17 21:30:33.115560 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.115465 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fc92d74d-3f8d-47c4-9c67-8d411bad18a3/init-config-reloader/0.log"
Apr 17 21:30:33.153767 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.153737 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-7wmg7_699d9724-833b-4266-b2d6-0ae0369b1d91/cluster-monitoring-operator/0.log"
Apr 17 21:30:33.182414 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.182380 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-lvpjg_a4522390-be08-4472-b1b5-4d1db64090fb/kube-state-metrics/0.log"
Apr 17 21:30:33.209858 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.209827 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-lvpjg_a4522390-be08-4472-b1b5-4d1db64090fb/kube-rbac-proxy-main/0.log"
Apr 17 21:30:33.234486 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.234451 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-lvpjg_a4522390-be08-4472-b1b5-4d1db64090fb/kube-rbac-proxy-self/0.log"
Apr 17 21:30:33.264175 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.264143 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-578cd5b9d8-6lxxg_f902dfc2-4680-4303-9548-92e70e5538b0/metrics-server/0.log"
Apr 17 21:30:33.325532 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.325419 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-27ts9_7bc9046e-be5c-4615-af31-2fa594c57289/node-exporter/0.log"
Apr 17 21:30:33.344239 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.344211 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-27ts9_7bc9046e-be5c-4615-af31-2fa594c57289/kube-rbac-proxy/0.log"
Apr 17 21:30:33.365727 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.365645 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-27ts9_7bc9046e-be5c-4615-af31-2fa594c57289/init-textfile/0.log"
Apr 17 21:30:33.843916 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.843884 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85cd5dfb97-ddpld_075cee51-892e-4c92-90cb-bc43f3a6c219/telemeter-client/0.log"
Apr 17 21:30:33.865716 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.865682 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85cd5dfb97-ddpld_075cee51-892e-4c92-90cb-bc43f3a6c219/reload/0.log"
Apr 17 21:30:33.908500 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:33.908466 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-85cd5dfb97-ddpld_075cee51-892e-4c92-90cb-bc43f3a6c219/kube-rbac-proxy/0.log"
Apr 17 21:30:35.228199 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.228164 2567 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"]
Apr 17 21:30:35.233840 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.233815 2567 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.240272 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.240241 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"]
Apr 17 21:30:35.320118 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.320075 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4pch\" (UniqueName: \"kubernetes.io/projected/162ae60d-cbb4-498c-91a5-d951415c6fb7-kube-api-access-m4pch\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.320440 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.320418 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-lib-modules\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.320695 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.320676 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-podres\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.320840 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.320826 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-sys\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.321035 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.321020 2567 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-proc\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422155 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422114 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-sys\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422242 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-sys\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422243 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-proc\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422339 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422303 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-proc\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422355 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4pch\" (UniqueName: \"kubernetes.io/projected/162ae60d-cbb4-498c-91a5-d951415c6fb7-kube-api-access-m4pch\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422386 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-lib-modules\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422549 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422440 2567 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-podres\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422740 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422597 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-podres\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.422740 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.422612 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/162ae60d-cbb4-498c-91a5-d951415c6fb7-lib-modules\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.432420 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.432388 2567 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4pch\" (UniqueName: \"kubernetes.io/projected/162ae60d-cbb4-498c-91a5-d951415c6fb7-kube-api-access-m4pch\") pod \"perf-node-gather-daemonset-6cl8z\" (UID: \"162ae60d-cbb4-498c-91a5-d951415c6fb7\") " pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"
Apr 17 21:30:35.546135 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.546045 2567 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z" Apr 17 21:30:35.604312 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.604279 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/2.log" Apr 17 21:30:35.612205 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.612139 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2d498_c7d6b966-299f-473e-b704-4ce1b867b0b5/console-operator/3.log" Apr 17 21:30:35.730278 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.730232 2567 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z"] Apr 17 21:30:35.903127 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:35.903086 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z" event={"ID":"162ae60d-cbb4-498c-91a5-d951415c6fb7","Type":"ContainerStarted","Data":"5e831b7385e6fcda19b0489f1f20ee14824bb6840f3c0a0a1d7e7808f187fe96"} Apr 17 21:30:36.092036 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:36.091999 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b9fdff79-95ddb_85c07886-1c15-4979-8e03-c7c45b836fa5/console/0.log" Apr 17 21:30:36.121161 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:36.121121 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-tn47b_2d3a0e1b-9289-4b0f-8208-4ce9ce9a9531/download-server/0.log" Apr 17 21:30:36.635928 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:36.635891 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-8f5t4_c2b61a7e-5701-47f5-9c33-420684fa3f8d/volume-data-source-validator/0.log" Apr 17 21:30:36.910835 
ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:36.910743 2567 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z" event={"ID":"162ae60d-cbb4-498c-91a5-d951415c6fb7","Type":"ContainerStarted","Data":"d61feb76419a761d0cb4c42a3c4e3cfd7de3b9a3c0be5b0401bd5f7e2d2b28e3"} Apr 17 21:30:36.911006 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:36.910843 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z" Apr 17 21:30:36.927870 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:36.927807 2567 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z" podStartSLOduration=1.927789664 podStartE2EDuration="1.927789664s" podCreationTimestamp="2026-04-17 21:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:30:36.92687251 +0000 UTC m=+992.496385289" watchObservedRunningTime="2026-04-17 21:30:36.927789664 +0000 UTC m=+992.497302418" Apr 17 21:30:37.499837 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:37.499812 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4d6rl_49118387-7ece-4934-bcd2-c3a2447f3933/dns/0.log" Apr 17 21:30:37.522157 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:37.522123 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4d6rl_49118387-7ece-4934-bcd2-c3a2447f3933/kube-rbac-proxy/0.log" Apr 17 21:30:37.748921 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:37.748893 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-khzwm_e067c0b5-668b-46b2-855d-e7cc2d5b9db4/dns-node-resolver/0.log" Apr 17 21:30:38.245627 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:38.245587 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_image-registry-76c8559dfb-m94ms_97a746ea-9b60-4490-927e-dcdccb81be88/registry/0.log" Apr 17 21:30:38.290706 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:38.290672 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kv4pm_5b17b894-59e3-497a-832e-05720d6d30d8/node-ca/0.log" Apr 17 21:30:39.104802 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:39.104767 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf29wcr_438ec947-206d-49a2-af2a-f80a0d472353/istio-proxy/0.log" Apr 17 21:30:39.253295 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:39.253264 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-2fdt9_03621f00-c1ff-4311-870e-48851bec8851/istio-proxy/0.log" Apr 17 21:30:39.795464 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:39.795434 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-l25qj_46806929-68d7-4f2b-a1a4-39799c177ba4/serve-healthcheck-canary/0.log" Apr 17 21:30:40.345500 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:40.345469 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9j7wq_2d200915-24d1-46aa-9b70-20c8ff4392cb/kube-rbac-proxy/0.log" Apr 17 21:30:40.389410 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:40.389377 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9j7wq_2d200915-24d1-46aa-9b70-20c8ff4392cb/exporter/0.log" Apr 17 21:30:40.438441 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:40.438409 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9j7wq_2d200915-24d1-46aa-9b70-20c8ff4392cb/extractor/0.log" Apr 17 21:30:42.785428 ip-10-0-134-198 
kubenswrapper[2567]: I0417 21:30:42.785391 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-694fdf7c65-95kfc_463b17e4-2b3e-46d5-affc-05862505d3ba/manager/0.log" Apr 17 21:30:42.926926 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:42.926898 2567 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-w64ck/perf-node-gather-daemonset-6cl8z" Apr 17 21:30:44.191922 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:44.191892 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54f8864c6c-4wj92_348290e3-f3af-4dd5-9881-2b6543bd7481/manager/0.log" Apr 17 21:30:48.556633 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:48.556601 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w92gw_e2a830a3-2453-4387-872f-788fbca4b588/migrator/0.log" Apr 17 21:30:48.574594 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:48.574564 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-w92gw_e2a830a3-2453-4387-872f-788fbca4b588/graceful-termination/0.log" Apr 17 21:30:50.155350 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.155325 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/kube-multus-additional-cni-plugins/0.log" Apr 17 21:30:50.176776 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.176746 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/egress-router-binary-copy/0.log" Apr 17 21:30:50.194817 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.194783 2567 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/cni-plugins/0.log" Apr 17 21:30:50.212745 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.212724 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/bond-cni-plugin/0.log" Apr 17 21:30:50.231503 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.231479 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/routeoverride-cni/0.log" Apr 17 21:30:50.250192 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.250166 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/whereabouts-cni-bincopy/0.log" Apr 17 21:30:50.268393 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.268358 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xl898_6d1fc6dd-6533-4846-82a8-55fc0feb006f/whereabouts-cni/0.log" Apr 17 21:30:50.368309 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.368277 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nqctl_9b32da7b-3f9e-431b-b417-68d5b307bbd0/kube-multus/0.log" Apr 17 21:30:50.391070 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.391040 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ndfzt_95290018-54bc-46b1-8b24-b0bae6086a51/network-metrics-daemon/0.log" Apr 17 21:30:50.407136 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:50.407068 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ndfzt_95290018-54bc-46b1-8b24-b0bae6086a51/kube-rbac-proxy/0.log" Apr 17 21:30:51.627846 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.627813 
2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-controller/0.log" Apr 17 21:30:51.642930 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.642898 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/0.log" Apr 17 21:30:51.651750 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.651725 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovn-acl-logging/1.log" Apr 17 21:30:51.674809 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.674781 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/kube-rbac-proxy-node/0.log" Apr 17 21:30:51.694256 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.694228 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 21:30:51.710018 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.709990 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/northd/0.log" Apr 17 21:30:51.728203 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.728178 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/nbdb/0.log" Apr 17 21:30:51.746713 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.746689 2567 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/sbdb/0.log" Apr 17 21:30:51.919144 ip-10-0-134-198 kubenswrapper[2567]: I0417 21:30:51.919067 2567 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hn2f5_2dfc2041-df04-460a-9385-f9b334671d62/ovnkube-controller/0.log"