Apr 17 20:01:33.050587 ip-10-0-130-159 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:01:33.050601 ip-10-0-130-159 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:01:33.050611 ip-10-0-130-159 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:01:33.051079 ip-10-0-130-159 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:01:43.202558 ip-10-0-130-159 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:01:43.202576 ip-10-0-130-159 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bf01ab44f4194c7cbf36d619c6317505 --
Apr 17 20:04:11.531740 ip-10-0-130-159 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:04:12.005637 ip-10-0-130-159 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:04:12.005637 ip-10-0-130-159 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:04:12.005637 ip-10-0-130-159 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:04:12.005637 ip-10-0-130-159 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:04:12.005637 ip-10-0-130-159 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:04:12.008886 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.008786    2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:04:12.011927 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011903    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:04:12.011927 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011925    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:04:12.011927 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011929    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:04:12.011927 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011932    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011937    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011942    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011946    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011949    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011952    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011956    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011959    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011961    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011964    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011967    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011969    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011972    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011975    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011977    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011980    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011982    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011985    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011988    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011996    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:04:12.012064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.011999    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012002    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012005    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012007    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012010    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012012    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012015    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012018    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012021    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012024    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012029    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012032    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012036    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012039    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012043    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012046    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012049    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012052    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012055    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:04:12.012568 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012078    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012082    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012085    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012088    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012091    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012094    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012097    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012100    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012103    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012106    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012108    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012111    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012113    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012116    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012118    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012121    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012124    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012133    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012137    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012139    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:04:12.013088 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012142    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012145    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012148    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012150    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012153    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012156    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012159    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012162    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012165    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012168    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012171    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012174    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012177    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012180    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012183    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012185    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012189    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012192    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012194    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:04:12.013589 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012197    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012200    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012202    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012205    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012208    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012689    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012695    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012700    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012705    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012708    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012711    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012714    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012716    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012719    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012722    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012725    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012728    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012730    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012734    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:04:12.014058 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012737    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012740    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012744    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012747    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012750    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012752    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012755    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012758    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012760    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012763    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012765    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012768    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012771    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012773    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012776    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012779    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012781    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012784    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012787    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012789    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:04:12.014554 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012792    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012795    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012797    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012800    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012802    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012805    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012808    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012810    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012812    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012815    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012818    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012821    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012824    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012828    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012831    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012835    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012839    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012842    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012845    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:04:12.015057 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012847    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012850    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012853    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012855    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012858    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012860    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012864    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012867    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012869    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012872    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012875    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012877    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012880    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012882    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012885    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012888    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012890    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012892    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012895    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012898    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:04:12.015593 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012901    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012904    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012906    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012909    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012912    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012915    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012918    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012921    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012924    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012926    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012929    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012931    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.012934    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013640    2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013658    2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013666    2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013671    2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013676    2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013680    2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013685    2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013690    2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013693    2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:04:12.016082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013697    2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013701    2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013704    2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013708    2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013711    2568 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013714    2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013718    2568 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013722    2568 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013725    2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013728    2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013733    2568 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013736    2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013740    2568 flags.go:64] FLAG: --config-dir=""
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013743    2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013747    2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013750    2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013754    2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013757    2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013760    2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013763    2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013767    2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013771    2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013774    2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013777    2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013782    2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:04:12.016626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013785    2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013788    2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013791    2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013794    2568 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013797    2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013802    2568 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013805    2568 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013809    2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013812    2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013815    2568 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013819    2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:04:12.017245 ip-10-0-130-159
kubenswrapper[2568]: I0417 20:04:12.013823 2568 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013826 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013833 2568 flags.go:64] FLAG: --eviction-soft="" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013837 2568 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013840 2568 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013843 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013859 2568 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013863 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013867 2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013870 2568 flags.go:64] FLAG: --feature-gates="" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013875 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013878 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013882 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013886 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013890 2568 flags.go:64] FLAG: 
--healthz-port="10248" Apr 17 20:04:12.017245 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013893 2568 flags.go:64] FLAG: --help="false" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013896 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013900 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013904 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013907 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013910 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013914 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013917 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013920 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013923 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013926 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013929 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013932 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:04:12.017894 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:04:12.013935 2568 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013938 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013941 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013945 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013947 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013950 2568 flags.go:64] FLAG: --lock-file="" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013955 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013959 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013962 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013968 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:04:12.017894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013971 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013974 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013977 2568 flags.go:64] FLAG: --logging-format="text" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013981 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013984 2568 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013987 2568 flags.go:64] FLAG: --manifest-url="" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013991 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013995 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.013999 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014003 2568 flags.go:64] FLAG: --max-pods="110" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014006 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014009 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014013 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014017 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014020 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014023 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014026 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014036 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014039 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 
20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014042 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014045 2568 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014051 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014057 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014061 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:04:12.018491 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014064 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014067 2568 flags.go:64] FLAG: --port="10250" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014071 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014074 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-017cfad4ab717b73a" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014077 2568 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014081 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014084 2568 flags.go:64] FLAG: --register-node="true" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014087 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014090 2568 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:04:12.019170 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:04:12.014094 2568 flags.go:64] FLAG: --registry-burst="10" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014097 2568 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014100 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014102 2568 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014106 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014109 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014113 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014116 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014119 2568 flags.go:64] FLAG: --runonce="false" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014122 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014126 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014129 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014140 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014143 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014147 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 
20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014150 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014153 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:04:12.019170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014156 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014159 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014162 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014167 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014170 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014173 2568 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014176 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014182 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014185 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014187 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014192 2568 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014195 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014198 2568 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014202 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014205 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014208 2568 flags.go:64] FLAG: --v="2" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014213 2568 flags.go:64] FLAG: --version="false" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014217 2568 flags.go:64] FLAG: --vmodule="" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014221 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014225 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014326 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014330 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014333 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014337 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:04:12.019844 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014340 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014343 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: 
W0417 20:04:12.014346 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014349 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014352 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014355 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014358 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014361 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014364 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014366 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014369 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014373 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014375 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014378 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014381 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 
20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014383 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014386 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014388 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014391 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014394 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:04:12.020769 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014411 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014414 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014416 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014419 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014422 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014424 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014427 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014430 2568 feature_gate.go:328] 
unrecognized feature gate: InsightsConfigAPI Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014433 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014435 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014438 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014441 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014443 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014446 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014448 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014451 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014454 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014457 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014459 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:04:12.021723 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014462 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:04:12.021723 ip-10-0-130-159 
kubenswrapper[2568]: W0417 20:04:12.014464 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014467 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014470 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014474 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014476 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014479 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014481 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014484 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014487 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014489 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014493 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014497 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014500 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014503 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014506 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014508 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014511 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014513 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014516 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:04:12.022426 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014518 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014520 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014523 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014526 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014528 2568 feature_gate.go:328] 
unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014531 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014535 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014538 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014541 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014544 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014547 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014550 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014552 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014555 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014558 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014561 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014566 2568 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesvSphere Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014569 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014571 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:04:12.022920 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014574 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014576 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014579 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.014583 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.014591 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.023033 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.023065 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023147 2568 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023156 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023162 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023166 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023171 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023176 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023180 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023185 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:04:12.023697 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023189 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023194 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023198 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023203 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023207 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:04:12.024348 ip-10-0-130-159 
kubenswrapper[2568]: W0417 20:04:12.023213 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023217 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023222 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023226 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023230 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023234 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023238 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023243 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023247 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023251 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023255 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023259 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023264 2568 feature_gate.go:328] unrecognized 
feature gate: Example2 Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023269 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023275 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:04:12.024348 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023284 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023289 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023295 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023302 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023308 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023314 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023318 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023322 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023327 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023331 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:04:12.024889 ip-10-0-130-159 
kubenswrapper[2568]: W0417 20:04:12.023335 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023339 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023343 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023347 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023351 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023356 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023360 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023365 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023369 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023374 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:04:12.024889 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023378 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023382 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023386 2568 feature_gate.go:328] unrecognized 
feature gate: GCPCustomAPIEndpoints Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023391 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023411 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023416 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023421 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023427 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023433 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023438 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023442 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023446 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023450 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023454 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023458 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:04:12.025491 
ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023463 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023470 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023474 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023479 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:04:12.025491 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023483 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023487 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023491 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023495 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023501 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023505 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023509 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023514 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023518 2568 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAWS Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023522 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023527 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023531 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023535 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023539 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023544 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023548 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023553 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023558 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:04:12.026224 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023562 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.023570 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023740 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023750 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023757 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023764 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023769 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023774 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023779 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023783 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023788 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023793 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023798 2568 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023802 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:04:12.027074 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023807 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023811 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023815 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023820 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023824 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023829 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023833 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023837 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023841 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023845 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023849 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: 
W0417 20:04:12.023854 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023858 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023862 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023866 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023871 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023875 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023879 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023883 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023887 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:04:12.027591 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023892 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023896 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023900 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023905 2568 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023909 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023913 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023917 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023922 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023928 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023934 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023939 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023944 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023948 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023953 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023957 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023962 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 20:04:12.028220 ip-10-0-130-159 
kubenswrapper[2568]: W0417 20:04:12.023966 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023970 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023974 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:04:12.028220 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023978 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023982 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023987 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023991 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023995 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.023999 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024003 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024008 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024012 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024016 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 20:04:12.028804 
ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024020 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024024 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024028 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024032 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024037 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024041 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024046 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024050 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024055 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024059 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:04:12.028804 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024063 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024067 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 
20:04:12.024071 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024075 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024080 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024084 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024088 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024093 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024097 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024102 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024106 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024110 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024114 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024118 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:12.024122 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:04:12.029356 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:04:12.024130 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 20:04:12.029356 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.025003 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 20:04:12.029776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.027919 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 20:04:12.029776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.029062 2568 server.go:1019] "Starting client certificate rotation" Apr 17 20:04:12.029776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.029164 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:04:12.029776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.029203 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 20:04:12.058751 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.058712 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:04:12.063705 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.063679 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 20:04:12.076155 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.076127 2568 log.go:25] "Validated CRI v1 runtime API" Apr 17 
20:04:12.081807 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.081785 2568 log.go:25] "Validated CRI v1 image API" Apr 17 20:04:12.084332 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.084308 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 20:04:12.084684 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.084668 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 20:04:12.088121 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.088088 2568 fs.go:135] Filesystem UUIDs: map[519f9e6c-b491-4abb-b266-29ae6d55f768:/dev/nvme0n1p3 59b6c5be-969f-4963-be36-71a6fe26d636:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 17 20:04:12.088121 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.088116 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 20:04:12.095241 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.095107 2568 manager.go:217] Machine: {Timestamp:2026-04-17 20:04:12.093162711 +0000 UTC m=+0.433317056 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499994 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2890cec529f79b2683b6bd52f8c3d0 SystemUUID:ec2890ce-c529-f79b-2683-b6bd52f8c3d0 BootID:bf01ab44-f419-4c7c-bf36-d619c6317505 
Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b3:56:be:7b:3f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b3:56:be:7b:3f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:7e:51:b5:78:0c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 20:04:12.095241 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.095229 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 20:04:12.095359 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.095329 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 20:04:12.097388 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.097355 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 20:04:12.097555 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.097390 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-159.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessTh
an","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 20:04:12.097606 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.097567 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 20:04:12.097606 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.097576 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 20:04:12.097606 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.097589 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:04:12.097606 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.097602 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 20:04:12.099249 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.099237 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:04:12.099385 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.099375 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 20:04:12.102007 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.101996 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 17 20:04:12.102068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.102013 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 20:04:12.102068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.102025 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 20:04:12.102068 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:04:12.102058 2568 kubelet.go:397] "Adding apiserver pod source" Apr 17 20:04:12.102068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.102068 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 20:04:12.103310 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.103294 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:04:12.103419 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.103318 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 20:04:12.103945 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.103926 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sm95s" Apr 17 20:04:12.107717 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.107631 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 20:04:12.109767 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.109748 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:04:12.111025 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111011 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:04:12.111089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111037 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:04:12.111089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111048 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:04:12.111089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111056 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:04:12.111089 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:04:12.111065 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:04:12.111089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111075 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:04:12.111089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111083 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:04:12.111260 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111092 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:04:12.111260 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111104 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:04:12.111260 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111113 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:04:12.111260 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111126 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 20:04:12.111260 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111139 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:04:12.111536 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111519 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sm95s" Apr 17 20:04:12.111826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111815 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:04:12.111826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.111827 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:04:12.114391 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.114366 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io 
"ip-10-0-130-159.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:04:12.114496 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.114382 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-159.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:04:12.114496 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.114427 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:04:12.116692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.116678 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:04:12.116774 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.116721 2568 server.go:1295] "Started kubelet" Apr 17 20:04:12.116849 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.116816 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:04:12.116907 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.116815 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:04:12.116907 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.116894 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:04:12.117635 ip-10-0-130-159 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 20:04:12.119518 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.119504 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:04:12.121132 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.121116 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:04:12.126763 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.126745 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:04:12.127502 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.127486 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:04:12.128068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.128048 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:04:12.128683 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.128661 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:04:12.128683 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.128664 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:04:12.128812 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.128692 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:04:12.128812 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.128805 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:04:12.128896 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.128815 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:04:12.129113 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.129093 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found" Apr 17 20:04:12.130371 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130353 2568 
factory.go:55] Registering systemd factory Apr 17 20:04:12.130371 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130363 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:04:12.130532 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130376 2568 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:04:12.130697 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130681 2568 factory.go:153] Registering CRI-O factory Apr 17 20:04:12.130739 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130705 2568 factory.go:223] Registration of the crio container factory successfully Apr 17 20:04:12.130798 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130777 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:04:12.130829 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130812 2568 factory.go:103] Registering Raw factory Apr 17 20:04:12.130924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.130908 2568 manager.go:1196] Started watching for new ooms in manager Apr 17 20:04:12.131532 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.131514 2568 manager.go:319] Starting recovery of all containers Apr 17 20:04:12.137000 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.136971 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-159.ec2.internal\" not found" node="ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.143276 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.143094 2568 manager.go:324] Recovery completed Apr 17 20:04:12.147992 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.147977 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" 
Apr 17 20:04:12.150692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.150674 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:12.150763 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.150705 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:12.150763 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.150716 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:12.151211 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.151195 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:04:12.151284 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.151211 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:04:12.151284 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.151234 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:04:12.153515 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.153501 2568 policy_none.go:49] "None policy: Start" Apr 17 20:04:12.153595 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.153521 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:04:12.153595 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.153534 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.191520 2568 manager.go:341] "Starting Device Plugin manager" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.191554 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.191567 2568 server.go:85] "Starting device plugin registration server" Apr 
17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.191829 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.191841 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.191942 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.192016 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.192023 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.192776 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 20:04:12.198591 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.192821 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-159.ec2.internal\" not found" Apr 17 20:04:12.261107 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.261009 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 20:04:12.262361 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.262346 2568 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 20:04:12.262467 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.262376 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:04:12.262467 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.262415 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 20:04:12.262467 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.262425 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:04:12.262582 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.262471 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:04:12.264586 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.264561 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:04:12.292417 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.292349 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:12.293347 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.293332 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:12.293442 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.293364 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:12.293442 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.293374 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:12.293442 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.293415 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.302374 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:04:12.302357 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.302453 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.302383 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-159.ec2.internal\": node \"ip-10-0-130-159.ec2.internal\" not found" Apr 17 20:04:12.316611 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.316584 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found" Apr 17 20:04:12.363091 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.363050 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal"] Apr 17 20:04:12.363258 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.363201 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:12.364213 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.364197 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:12.364291 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.364228 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:12.364291 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.364241 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:12.365791 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.365778 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:12.365942 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.365927 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.365993 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.365957 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:12.366591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.366574 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:12.366670 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.366604 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:12.366670 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.366613 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:12.366670 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.366578 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:12.366764 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.366683 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:12.366764 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.366697 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:12.367834 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.367821 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.367875 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.367844 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:04:12.368564 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.368547 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:04:12.368638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.368577 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:04:12.368638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.368589 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:04:12.385030 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.385001 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-159.ec2.internal\" not found" node="ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.389254 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.389233 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-159.ec2.internal\" not found" node="ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.417292 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.417261 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found" Apr 17 20:04:12.430744 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.430715 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6e89466f546d82f5fc8e46ec06064587-config\") pod 
\"kube-apiserver-proxy-ip-10-0-130-159.ec2.internal\" (UID: \"6e89466f546d82f5fc8e46ec06064587\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.430850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.430749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1fd0626565d1684df72afda7787b2e7f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal\" (UID: \"1fd0626565d1684df72afda7787b2e7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.430850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.430774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fd0626565d1684df72afda7787b2e7f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal\" (UID: \"1fd0626565d1684df72afda7787b2e7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.517834 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.517746 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found" Apr 17 20:04:12.531127 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.531097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1fd0626565d1684df72afda7787b2e7f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal\" (UID: \"1fd0626565d1684df72afda7787b2e7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.531239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.531136 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1fd0626565d1684df72afda7787b2e7f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal\" (UID: \"1fd0626565d1684df72afda7787b2e7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.531239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.531206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1fd0626565d1684df72afda7787b2e7f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal\" (UID: \"1fd0626565d1684df72afda7787b2e7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.531320 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.531254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6e89466f546d82f5fc8e46ec06064587-config\") pod \"kube-apiserver-proxy-ip-10-0-130-159.ec2.internal\" (UID: \"6e89466f546d82f5fc8e46ec06064587\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.531320 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.531271 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1fd0626565d1684df72afda7787b2e7f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal\" (UID: \"1fd0626565d1684df72afda7787b2e7f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" Apr 17 20:04:12.531320 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.531284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6e89466f546d82f5fc8e46ec06064587-config\") pod \"kube-apiserver-proxy-ip-10-0-130-159.ec2.internal\" (UID: \"6e89466f546d82f5fc8e46ec06064587\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal"
Apr 17 20:04:12.618609 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.618577 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:12.687075 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.687049 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal"
Apr 17 20:04:12.691771 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:12.691746 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal"
Apr 17 20:04:12.719638 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.719605 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:12.820174 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.820068 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:12.920609 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:12.920568 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.021206 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.021167 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.028292 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.028269 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 20:04:13.028659 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.028486 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:04:13.028659 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.028488 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:04:13.114140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.114059 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 19:59:12 +0000 UTC" deadline="2028-02-02 11:31:23.000702804 +0000 UTC"
Apr 17 20:04:13.114140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.114100 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15735h27m9.886607657s"
Apr 17 20:04:13.121387 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.121358 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.127995 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.127971 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 20:04:13.137223 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.137198 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:04:13.158577 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.158548 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fhm7r"
Apr 17 20:04:13.166331 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.166303 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fhm7r"
Apr 17 20:04:13.221720 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.221691 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.227925 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:13.227838 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fd0626565d1684df72afda7787b2e7f.slice/crio-775b186cc5be12326d48191af1edd18340f408ecc63b8c4fa42b5173b4d9f76c WatchSource:0}: Error finding container 775b186cc5be12326d48191af1edd18340f408ecc63b8c4fa42b5173b4d9f76c: Status 404 returned error can't find the container with id 775b186cc5be12326d48191af1edd18340f408ecc63b8c4fa42b5173b4d9f76c
Apr 17 20:04:13.228295 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:13.228269 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e89466f546d82f5fc8e46ec06064587.slice/crio-fa925ff79574de2cee31f5e68610f5416f834e8d9aca2835c34f0572085c8747 WatchSource:0}: Error finding container fa925ff79574de2cee31f5e68610f5416f834e8d9aca2835c34f0572085c8747: Status 404 returned error can't find the container with id fa925ff79574de2cee31f5e68610f5416f834e8d9aca2835c34f0572085c8747
Apr 17 20:04:13.232556 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.232535 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:04:13.266020 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.265965 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" event={"ID":"1fd0626565d1684df72afda7787b2e7f","Type":"ContainerStarted","Data":"775b186cc5be12326d48191af1edd18340f408ecc63b8c4fa42b5173b4d9f76c"}
Apr 17 20:04:13.266842 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.266822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal" event={"ID":"6e89466f546d82f5fc8e46ec06064587","Type":"ContainerStarted","Data":"fa925ff79574de2cee31f5e68610f5416f834e8d9aca2835c34f0572085c8747"}
Apr 17 20:04:13.322057 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.322018 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.422723 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.422637 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.523209 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.523152 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.624037 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:13.624002 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-159.ec2.internal\" not found"
Apr 17 20:04:13.682365 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.682288 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:04:13.684328 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.684303 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:04:13.729201 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.729051 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal"
Apr 17 20:04:13.743584 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.743431 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:04:13.744722 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.744485 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal"
Apr 17 20:04:13.752811 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:13.752693 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:04:14.103156 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.103079 2568 apiserver.go:52] "Watching apiserver"
Apr 17 20:04:14.112876 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.112813 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 20:04:14.113962 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.113933 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal","openshift-multus/multus-additional-cni-plugins-245q6","openshift-network-diagnostics/network-check-target-x94f6","openshift-network-operator/iptables-alerter-9d497","openshift-ovn-kubernetes/ovnkube-node-jqvld","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp","openshift-cluster-node-tuning-operator/tuned-ztjgz","openshift-dns/node-resolver-88qdj","openshift-multus/multus-rxx56","openshift-multus/network-metrics-daemon-2ctfd","kube-system/konnectivity-agent-6bfbp","kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal","openshift-image-registry/node-ca-gkjdr"]
Apr 17 20:04:14.116130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.116105 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.117453 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.117432 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.118699 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.118674 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.118699 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.118687 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.118848 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.118750 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8zd92\""
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119631 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119661 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119718 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dkw2q\""
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119661 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119665 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.119803 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119844 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 20:04:14.119940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.119861 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.120984 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.120964 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9d497"
Apr 17 20:04:14.122292 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.122273 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.122414 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.122357 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.123111 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.123089 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.123218 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.123203 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 20:04:14.123265 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.123241 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qrlkt\""
Apr 17 20:04:14.123311 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.123277 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.123599 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.123581 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-88qdj"
Apr 17 20:04:14.124289 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.124246 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 20:04:14.124686 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.124669 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.124936 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125041 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125116 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125166 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vgm4k\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125184 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125195 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.124943 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mwk7d\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125245 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 20:04:14.125441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125421 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 20:04:14.125890 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125729 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hbspv\""
Apr 17 20:04:14.125890 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.125732 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.126030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.126009 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.126537 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.126519 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.126627 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.126608 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:14.126736 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.126711 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:14.128062 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.128045 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:14.128459 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.128440 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 20:04:14.128556 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.128470 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5xzvn\""
Apr 17 20:04:14.129629 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.129608 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gkjdr"
Apr 17 20:04:14.130017 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.129976 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 20:04:14.130215 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.130195 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q6mkk\""
Apr 17 20:04:14.130358 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.130341 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 20:04:14.131761 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.131741 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 20:04:14.131858 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.131764 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 20:04:14.131920 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.131865 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 20:04:14.132016 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.131996 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7mfpt\""
Apr 17 20:04:14.140255 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysctl-d\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.140382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-cni-netd\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.140382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140302 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-os-release\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.140382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:14.140382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140347 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-socket-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.140382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-netns\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.140382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-hostroot\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140411 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-slash\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-systemd\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-log-socket\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140518 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-socket-dir-parent\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140548 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-var-lib-kubelet\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140573 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.140647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140598 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm7v9\" (UniqueName: \"kubernetes.io/projected/8beb86a8-efdf-4bba-8697-baf00c6854af-kube-api-access-lm7v9\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-run-ovn-kubernetes\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-os-release\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140755 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-k8s-cni-cncf-io\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-run\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87a4ec80-629d-4a66-8195-8fe5d60b43f9-host-slash\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92g9q\" (UniqueName: \"kubernetes.io/projected/32f70982-1fda-48ca-bbf7-530ff3957212-kube-api-access-92g9q\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140859 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-etc-kubernetes\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7470e9e7-7248-44cf-81a8-fc62c99d05b9-host\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-modprobe-d\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.140961 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.140949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-tuned\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141240 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-cni-bin\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141283 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a591f534-0100-4238-b0cc-81835de74e25-agent-certs\") pod \"konnectivity-agent-6bfbp\" (UID: \"a591f534-0100-4238-b0cc-81835de74e25\") " pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f70982-1fda-48ca-bbf7-530ff3957212-ovn-node-metrics-cert\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-registration-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-kubelet\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7470e9e7-7248-44cf-81a8-fc62c99d05b9-serviceca\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr"
Apr 17 20:04:14.141487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-kubernetes\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysctl-conf\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141532 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141556 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-ovn\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-device-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfchq\" (UniqueName: \"kubernetes.io/projected/648545b8-d5e2-4491-9d4d-e78f3052aefb-kube-api-access-zfchq\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a591f534-0100-4238-b0cc-81835de74e25-konnectivity-ca\") pod \"konnectivity-agent-6bfbp\" (UID: \"a591f534-0100-4238-b0cc-81835de74e25\") " pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tmx9\" (UniqueName: \"kubernetes.io/projected/7470e9e7-7248-44cf-81a8-fc62c99d05b9-kube-api-access-9tmx9\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-env-overrides\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thfds\" (UniqueName: \"kubernetes.io/projected/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-kube-api-access-thfds\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141825 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-system-cni-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-cnibin\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-systemd\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141915 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName:
\"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-lib-modules\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.141956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141947 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-system-cni-dir\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141972 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-var-lib-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.141997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2vs\" (UniqueName: \"kubernetes.io/projected/a3207e4f-83f5-4913-a57e-c29dd6aed2df-kube-api-access-cb2vs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142039 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-cni-multus\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.142612 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:04:14.142059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-etc-selinux\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-sys\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142098 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-systemd-units\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142120 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-run-netns\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-etc-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: 
\"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142180 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-cni-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-node-log\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142270 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-ovnkube-config\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-ovnkube-script-lib\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142317 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-daemon-config\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-multus-certs\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.142612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7szs\" (UniqueName: \"kubernetes.io/projected/d04fad23-2e01-4696-9db6-11106ea9ec13-kube-api-access-l7szs\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysconfig\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:04:14.142428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-tmp\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-cnibin\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-kubelet\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142491 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-sys-fs\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142525 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-hosts-file\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " 
pod="openshift-dns/node-resolver-88qdj" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142540 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-tmp-dir\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142559 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-host\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142596 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-cni-binary-copy\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142630 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142685 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/87a4ec80-629d-4a66-8195-8fe5d60b43f9-iptables-alerter-script\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142723 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dks2b\" (UniqueName: \"kubernetes.io/projected/87a4ec80-629d-4a66-8195-8fe5d60b43f9-kube-api-access-dks2b\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142761 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142797 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/648545b8-d5e2-4491-9d4d-e78f3052aefb-cni-binary-copy\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142822 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-cni-bin\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.143290 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:04:14.142849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplsm\" (UniqueName: \"kubernetes.io/projected/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-kube-api-access-hplsm\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.143815 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.142876 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-conf-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.167187 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.167149 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 19:59:13 +0000 UTC" deadline="2027-10-19 18:18:25.593960891 +0000 UTC" Apr 17 20:04:14.167187 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.167181 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13198h14m11.426782692s" Apr 17 20:04:14.230271 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.230227 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 20:04:14.243246 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243206 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-modprobe-d\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.243246 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243247 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-tuned\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-cni-bin\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243296 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243321 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a591f534-0100-4238-b0cc-81835de74e25-agent-certs\") pod \"konnectivity-agent-6bfbp\" (UID: \"a591f534-0100-4238-b0cc-81835de74e25\") " pod="kube-system/konnectivity-agent-6bfbp" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f70982-1fda-48ca-bbf7-530ff3957212-ovn-node-metrics-cert\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.243524 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:04:14.243383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-modprobe-d\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-registration-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243454 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-kubelet\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243455 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243504 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7470e9e7-7248-44cf-81a8-fc62c99d05b9-serviceca\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " 
pod="openshift-image-registry/node-ca-gkjdr" Apr 17 20:04:14.243524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-registration-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-kubernetes\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-kubelet\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysctl-conf\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243613 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-ovn\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-device-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfchq\" (UniqueName: \"kubernetes.io/projected/648545b8-d5e2-4491-9d4d-e78f3052aefb-kube-api-access-zfchq\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a591f534-0100-4238-b0cc-81835de74e25-konnectivity-ca\") pod \"konnectivity-agent-6bfbp\" (UID: \"a591f534-0100-4238-b0cc-81835de74e25\") " pod="kube-system/konnectivity-agent-6bfbp" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243697 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-cni-bin\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9tmx9\" (UniqueName: \"kubernetes.io/projected/7470e9e7-7248-44cf-81a8-fc62c99d05b9-kube-api-access-9tmx9\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-kubernetes\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243744 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-env-overrides\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thfds\" (UniqueName: \"kubernetes.io/projected/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-kube-api-access-thfds\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj" Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-system-cni-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243776 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-cnibin\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-systemd\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.244097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysctl-conf\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.244965 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.244380 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a591f534-0100-4238-b0cc-81835de74e25-konnectivity-ca\") pod \"konnectivity-agent-6bfbp\" (UID: \"a591f534-0100-4238-b0cc-81835de74e25\") " pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:14.244965 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.244468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-env-overrides\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.244965 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.244475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-lib-modules\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.244965 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.243999 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7470e9e7-7248-44cf-81a8-fc62c99d05b9-serviceca\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr"
Apr 17 20:04:14.244965 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.244844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-system-cni-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-cnibin\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245058 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-device-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-systemd\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-ovn\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-system-cni-dir\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-var-lib-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-system-cni-dir\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2vs\" (UniqueName: \"kubernetes.io/projected/a3207e4f-83f5-4913-a57e-c29dd6aed2df-kube-api-access-cb2vs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245175 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-lib-modules\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245173 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-var-lib-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245196 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-cni-multus\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-cni-multus\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-etc-selinux\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-sys\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-systemd-units\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245316 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-run-netns\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-etc-selinux\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245346 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-sys\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245413 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-run-netns\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-systemd-units\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.245790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-etc-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-etc-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.245995 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246019 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-cni-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-node-log\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246057 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-openvswitch\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-ovnkube-config\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-ovnkube-script-lib\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-daemon-config\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-multus-certs\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246181 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7szs\" (UniqueName: \"kubernetes.io/projected/d04fad23-2e01-4696-9db6-11106ea9ec13-kube-api-access-l7szs\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246203 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysconfig\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246227 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-tmp\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246247 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-cnibin\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-kubelet\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.246295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-sys-fs\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246325 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-hosts-file\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-tmp-dir\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-host\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-cni-binary-copy\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246541 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/87a4ec80-629d-4a66-8195-8fe5d60b43f9-iptables-alerter-script\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-ovnkube-config\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246669 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-node-log\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dks2b\" (UniqueName: \"kubernetes.io/projected/87a4ec80-629d-4a66-8195-8fe5d60b43f9-kube-api-access-dks2b\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246708 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/648545b8-d5e2-4491-9d4d-e78f3052aefb-cni-binary-copy\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-cni-bin\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hplsm\" (UniqueName: \"kubernetes.io/projected/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-kube-api-access-hplsm\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-conf-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246859 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysctl-d\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-cni-netd\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246912 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-os-release\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-socket-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.246993 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-netns\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-hostroot\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247046 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-slash\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-systemd\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-log-socket\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247128 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-socket-dir-parent\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-var-lib-kubelet\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm7v9\" (UniqueName: \"kubernetes.io/projected/8beb86a8-efdf-4bba-8697-baf00c6854af-kube-api-access-lm7v9\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/32f70982-1fda-48ca-bbf7-530ff3957212-ovnkube-script-lib\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-run-ovn-kubernetes\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247265 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-os-release\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247301 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-cni-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.247799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-k8s-cni-cncf-io\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247357 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-netns\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-hostroot\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247434 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-slash\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247440 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-run-systemd\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247568 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-log-socket\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-socket-dir-parent\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247661 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-var-lib-kubelet\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247776 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysctl-d\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-var-lib-cni-bin\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-tuned\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/32f70982-1fda-48ca-bbf7-530ff3957212-ovn-node-metrics-cert\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247877 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-conf-dir\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247901 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-run\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-run\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.247965 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.247976 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-hosts-file\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj"
Apr 17 20:04:14.248640 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-multus-certs\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-os-release\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.248085 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:04:14.748045761 +0000 UTC m=+3.088200103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-host-run-k8s-cni-cncf-io\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56"
Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248164 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-host\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz"
Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-etc-sysconfig\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248335 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a591f534-0100-4238-b0cc-81835de74e25-agent-certs\") pod \"konnectivity-agent-6bfbp\" (UID: \"a591f534-0100-4238-b0cc-81835de74e25\") " pod="kube-system/konnectivity-agent-6bfbp" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87a4ec80-629d-4a66-8195-8fe5d60b43f9-host-slash\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-run-ovn-kubernetes\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248448 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92g9q\" (UniqueName: \"kubernetes.io/projected/32f70982-1fda-48ca-bbf7-530ff3957212-kube-api-access-92g9q\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248463 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/648545b8-d5e2-4491-9d4d-e78f3052aefb-cni-binary-copy\") pod \"multus-rxx56\" (UID: 
\"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-kubelet\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248490 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-etc-kubernetes\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-cnibin\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/648545b8-d5e2-4491-9d4d-e78f3052aefb-etc-kubernetes\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248547 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8beb86a8-efdf-4bba-8697-baf00c6854af-os-release\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " 
pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248552 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/32f70982-1fda-48ca-bbf7-530ff3957212-host-cni-netd\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.249429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/87a4ec80-629d-4a66-8195-8fe5d60b43f9-host-slash\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248617 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7470e9e7-7248-44cf-81a8-fc62c99d05b9-host\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248628 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-245q6\" (UID: \"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-socket-dir\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: 
\"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248709 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d04fad23-2e01-4696-9db6-11106ea9ec13-sys-fs\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248783 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/87a4ec80-629d-4a66-8195-8fe5d60b43f9-iptables-alerter-script\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.248841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7470e9e7-7248-44cf-81a8-fc62c99d05b9-host\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.249074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-tmp-dir\") pod \"node-resolver-88qdj\" (UID: \"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.249107 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8beb86a8-efdf-4bba-8697-baf00c6854af-cni-binary-copy\") pod \"multus-additional-cni-plugins-245q6\" (UID: 
\"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.250012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.249215 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/648545b8-d5e2-4491-9d4d-e78f3052aefb-multus-daemon-config\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.250415 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.250302 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-tmp\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.252536 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.252509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfchq\" (UniqueName: \"kubernetes.io/projected/648545b8-d5e2-4491-9d4d-e78f3052aefb-kube-api-access-zfchq\") pod \"multus-rxx56\" (UID: \"648545b8-d5e2-4491-9d4d-e78f3052aefb\") " pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.253030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.253008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tmx9\" (UniqueName: \"kubernetes.io/projected/7470e9e7-7248-44cf-81a8-fc62c99d05b9-kube-api-access-9tmx9\") pod \"node-ca-gkjdr\" (UID: \"7470e9e7-7248-44cf-81a8-fc62c99d05b9\") " pod="openshift-image-registry/node-ca-gkjdr" Apr 17 20:04:14.253030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.253022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thfds\" (UniqueName: \"kubernetes.io/projected/316b73ef-655f-4979-a7b1-dcaf0e3bb3ad-kube-api-access-thfds\") pod \"node-resolver-88qdj\" (UID: 
\"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad\") " pod="openshift-dns/node-resolver-88qdj" Apr 17 20:04:14.253687 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.253662 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2vs\" (UniqueName: \"kubernetes.io/projected/a3207e4f-83f5-4913-a57e-c29dd6aed2df-kube-api-access-cb2vs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:14.258411 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.258375 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:14.258547 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.258423 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:14.258547 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.258439 2568 projected.go:194] Error preparing data for projected volume kube-api-access-98jpl for pod openshift-network-diagnostics/network-check-target-x94f6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:14.258547 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.258527 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl podName:9cf5bd58-f267-46f8-9af8-24426ecf56e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:14.758494249 +0000 UTC m=+3.098648596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-98jpl" (UniqueName: "kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl") pod "network-check-target-x94f6" (UID: "9cf5bd58-f267-46f8-9af8-24426ecf56e0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:14.261278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.261251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92g9q\" (UniqueName: \"kubernetes.io/projected/32f70982-1fda-48ca-bbf7-530ff3957212-kube-api-access-92g9q\") pod \"ovnkube-node-jqvld\" (UID: \"32f70982-1fda-48ca-bbf7-530ff3957212\") " pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.261618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.261597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dks2b\" (UniqueName: \"kubernetes.io/projected/87a4ec80-629d-4a66-8195-8fe5d60b43f9-kube-api-access-dks2b\") pod \"iptables-alerter-9d497\" (UID: \"87a4ec80-629d-4a66-8195-8fe5d60b43f9\") " pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.261826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.261804 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplsm\" (UniqueName: \"kubernetes.io/projected/eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd-kube-api-access-hplsm\") pod \"tuned-ztjgz\" (UID: \"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd\") " pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.262062 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.262038 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm7v9\" (UniqueName: \"kubernetes.io/projected/8beb86a8-efdf-4bba-8697-baf00c6854af-kube-api-access-lm7v9\") pod \"multus-additional-cni-plugins-245q6\" (UID: 
\"8beb86a8-efdf-4bba-8697-baf00c6854af\") " pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.262277 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.262260 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7szs\" (UniqueName: \"kubernetes.io/projected/d04fad23-2e01-4696-9db6-11106ea9ec13-kube-api-access-l7szs\") pod \"aws-ebs-csi-driver-node-6cdwp\" (UID: \"d04fad23-2e01-4696-9db6-11106ea9ec13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.386270 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.386191 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:04:14.429263 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.429222 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" Apr 17 20:04:14.437320 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.437282 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-245q6" Apr 17 20:04:14.446751 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.446718 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-9d497" Apr 17 20:04:14.450794 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.450773 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:04:14.457528 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.457496 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" Apr 17 20:04:14.464278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.464250 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-88qdj" Apr 17 20:04:14.472116 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.472085 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rxx56" Apr 17 20:04:14.478980 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.478940 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6bfbp" Apr 17 20:04:14.484703 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.484660 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gkjdr" Apr 17 20:04:14.609005 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.608963 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:04:14.752821 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.752724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:14.752989 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.752850 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:14.752989 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.752923 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:04:15.75290518 +0000 UTC m=+4.093059509 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:14.853629 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:14.853596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:14.853794 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.853774 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:14.853845 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.853800 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:14.853845 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.853811 2568 projected.go:194] Error preparing data for projected volume kube-api-access-98jpl for pod openshift-network-diagnostics/network-check-target-x94f6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:14.853906 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:14.853864 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl podName:9cf5bd58-f267-46f8-9af8-24426ecf56e0 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:15.853848957 +0000 UTC m=+4.194003286 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-98jpl" (UniqueName: "kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl") pod "network-check-target-x94f6" (UID: "9cf5bd58-f267-46f8-9af8-24426ecf56e0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:14.892888 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.892859 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8beb86a8_efdf_4bba_8697_baf00c6854af.slice/crio-2298ed8f1bcaac9274e3023e5c38b63743f776fae5c6a9a8dc14276dcc0b3222 WatchSource:0}: Error finding container 2298ed8f1bcaac9274e3023e5c38b63743f776fae5c6a9a8dc14276dcc0b3222: Status 404 returned error can't find the container with id 2298ed8f1bcaac9274e3023e5c38b63743f776fae5c6a9a8dc14276dcc0b3222 Apr 17 20:04:14.894235 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.894206 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod648545b8_d5e2_4491_9d4d_e78f3052aefb.slice/crio-f841b6a9d25254cb1af912a1bb8c4929f21ebb7a171bde3839e435ef3abfe142 WatchSource:0}: Error finding container f841b6a9d25254cb1af912a1bb8c4929f21ebb7a171bde3839e435ef3abfe142: Status 404 returned error can't find the container with id f841b6a9d25254cb1af912a1bb8c4929f21ebb7a171bde3839e435ef3abfe142 Apr 17 20:04:14.895479 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.895458 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda591f534_0100_4238_b0cc_81835de74e25.slice/crio-4c59e372f8681d8dd131b3ce2534d8571d6528996bda3d26ed1fe04ecfeb462c WatchSource:0}: Error finding container 
4c59e372f8681d8dd131b3ce2534d8571d6528996bda3d26ed1fe04ecfeb462c: Status 404 returned error can't find the container with id 4c59e372f8681d8dd131b3ce2534d8571d6528996bda3d26ed1fe04ecfeb462c Apr 17 20:04:14.896479 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.896458 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeab5cda_2d6b_4fd2_96fe_b2cb300b80fd.slice/crio-1edcf230f96e3bfabc29c4e484db5edbca41cfaf2125d2b4faaab3d520b24385 WatchSource:0}: Error finding container 1edcf230f96e3bfabc29c4e484db5edbca41cfaf2125d2b4faaab3d520b24385: Status 404 returned error can't find the container with id 1edcf230f96e3bfabc29c4e484db5edbca41cfaf2125d2b4faaab3d520b24385 Apr 17 20:04:14.898152 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.897939 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f70982_1fda_48ca_bbf7_530ff3957212.slice/crio-c574ede38979f04cce5def98b516288828ae078b4d30eb08654d7b6a5fa62a2b WatchSource:0}: Error finding container c574ede38979f04cce5def98b516288828ae078b4d30eb08654d7b6a5fa62a2b: Status 404 returned error can't find the container with id c574ede38979f04cce5def98b516288828ae078b4d30eb08654d7b6a5fa62a2b Apr 17 20:04:14.900344 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.900303 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd04fad23_2e01_4696_9db6_11106ea9ec13.slice/crio-9c548ac529ba0f322f91bbeaf264b5538dd709866629c0606b78e4d8977f4b5a WatchSource:0}: Error finding container 9c548ac529ba0f322f91bbeaf264b5538dd709866629c0606b78e4d8977f4b5a: Status 404 returned error can't find the container with id 9c548ac529ba0f322f91bbeaf264b5538dd709866629c0606b78e4d8977f4b5a Apr 17 20:04:14.901170 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.901153 2568 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87a4ec80_629d_4a66_8195_8fe5d60b43f9.slice/crio-f8f691e9923dc351c926a41c96308e0631ee5a1231079e20cac40201c9d0938b WatchSource:0}: Error finding container f8f691e9923dc351c926a41c96308e0631ee5a1231079e20cac40201c9d0938b: Status 404 returned error can't find the container with id f8f691e9923dc351c926a41c96308e0631ee5a1231079e20cac40201c9d0938b Apr 17 20:04:14.902414 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.902378 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316b73ef_655f_4979_a7b1_dcaf0e3bb3ad.slice/crio-427099815c8ed93b7f7cddb8d4b96f55ff4519b853ad681bd2135153b2aaaca1 WatchSource:0}: Error finding container 427099815c8ed93b7f7cddb8d4b96f55ff4519b853ad681bd2135153b2aaaca1: Status 404 returned error can't find the container with id 427099815c8ed93b7f7cddb8d4b96f55ff4519b853ad681bd2135153b2aaaca1 Apr 17 20:04:14.903945 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:14.903921 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7470e9e7_7248_44cf_81a8_fc62c99d05b9.slice/crio-7659d0280175b74c41612e697fb6b46e3354f933f184f0e8845d6b4103e7c8d6 WatchSource:0}: Error finding container 7659d0280175b74c41612e697fb6b46e3354f933f184f0e8845d6b4103e7c8d6: Status 404 returned error can't find the container with id 7659d0280175b74c41612e697fb6b46e3354f933f184f0e8845d6b4103e7c8d6 Apr 17 20:04:15.168503 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.168186 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 19:59:13 +0000 UTC" deadline="2027-10-30 15:06:16.546796333 +0000 UTC" Apr 17 20:04:15.168503 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.168419 2568 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13459h2m1.378406029s" Apr 17 20:04:15.278026 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.277370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal" event={"ID":"6e89466f546d82f5fc8e46ec06064587","Type":"ContainerStarted","Data":"818a648b59ca973d41ec60977c19c43e295f0823200b999509da8f6ae4d1f0b8"} Apr 17 20:04:15.281940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.281146 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9d497" event={"ID":"87a4ec80-629d-4a66-8195-8fe5d60b43f9","Type":"ContainerStarted","Data":"f8f691e9923dc351c926a41c96308e0631ee5a1231079e20cac40201c9d0938b"} Apr 17 20:04:15.285120 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.285089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gkjdr" event={"ID":"7470e9e7-7248-44cf-81a8-fc62c99d05b9","Type":"ContainerStarted","Data":"7659d0280175b74c41612e697fb6b46e3354f933f184f0e8845d6b4103e7c8d6"} Apr 17 20:04:15.290124 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.290073 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-88qdj" event={"ID":"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad","Type":"ContainerStarted","Data":"427099815c8ed93b7f7cddb8d4b96f55ff4519b853ad681bd2135153b2aaaca1"} Apr 17 20:04:15.299709 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.299634 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" event={"ID":"d04fad23-2e01-4696-9db6-11106ea9ec13","Type":"ContainerStarted","Data":"9c548ac529ba0f322f91bbeaf264b5538dd709866629c0606b78e4d8977f4b5a"} Apr 17 20:04:15.303006 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.302918 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" 
event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"c574ede38979f04cce5def98b516288828ae078b4d30eb08654d7b6a5fa62a2b"} Apr 17 20:04:15.306039 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.306008 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" event={"ID":"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd","Type":"ContainerStarted","Data":"1edcf230f96e3bfabc29c4e484db5edbca41cfaf2125d2b4faaab3d520b24385"} Apr 17 20:04:15.310869 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.310838 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6bfbp" event={"ID":"a591f534-0100-4238-b0cc-81835de74e25","Type":"ContainerStarted","Data":"4c59e372f8681d8dd131b3ce2534d8571d6528996bda3d26ed1fe04ecfeb462c"} Apr 17 20:04:15.318341 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.318307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rxx56" event={"ID":"648545b8-d5e2-4491-9d4d-e78f3052aefb","Type":"ContainerStarted","Data":"f841b6a9d25254cb1af912a1bb8c4929f21ebb7a171bde3839e435ef3abfe142"} Apr 17 20:04:15.323169 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.323097 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerStarted","Data":"2298ed8f1bcaac9274e3023e5c38b63743f776fae5c6a9a8dc14276dcc0b3222"} Apr 17 20:04:15.761157 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.761107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:15.761326 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:15.761281 2568 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:15.761419 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:15.761359 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:04:17.761338899 +0000 UTC m=+6.101493231 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:15.862476 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:15.861703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:15.862476 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:15.861934 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:15.862476 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:15.861957 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:15.862476 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:15.861969 2568 projected.go:194] Error preparing data for projected volume kube-api-access-98jpl for pod 
openshift-network-diagnostics/network-check-target-x94f6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:15.862476 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:15.862070 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl podName:9cf5bd58-f267-46f8-9af8-24426ecf56e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:17.862050011 +0000 UTC m=+6.202204343 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-98jpl" (UniqueName: "kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl") pod "network-check-target-x94f6" (UID: "9cf5bd58-f267-46f8-9af8-24426ecf56e0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:16.265507 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:16.265477 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:16.265957 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:16.265611 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:16.266091 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:16.266024 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:16.266158 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:16.266111 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:16.343956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:16.343856 2568 generic.go:358] "Generic (PLEG): container finished" podID="1fd0626565d1684df72afda7787b2e7f" containerID="718483ea9087233f63cae02de9a7cb9c0b7eff10b2b07506ade7993a72a44f9a" exitCode=0 Apr 17 20:04:16.345059 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:16.344801 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" event={"ID":"1fd0626565d1684df72afda7787b2e7f","Type":"ContainerDied","Data":"718483ea9087233f63cae02de9a7cb9c0b7eff10b2b07506ade7993a72a44f9a"} Apr 17 20:04:16.360031 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:16.358720 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-159.ec2.internal" podStartSLOduration=3.358697139 podStartE2EDuration="3.358697139s" podCreationTimestamp="2026-04-17 20:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:04:15.293274168 +0000 UTC m=+3.633428521" watchObservedRunningTime="2026-04-17 20:04:16.358697139 +0000 UTC m=+4.698851491" Apr 17 20:04:17.356969 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:17.356929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" event={"ID":"1fd0626565d1684df72afda7787b2e7f","Type":"ContainerStarted","Data":"b4f58516a110e4c899e662334c19922a42b109bca3113b9c69b648b007fae3e1"} Apr 17 20:04:17.371451 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:17.371370 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-159.ec2.internal" podStartSLOduration=4.371348478 podStartE2EDuration="4.371348478s" podCreationTimestamp="2026-04-17 20:04:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:04:17.370846881 +0000 UTC m=+5.711001246" watchObservedRunningTime="2026-04-17 20:04:17.371348478 +0000 UTC m=+5.711502830" Apr 17 20:04:17.781089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:17.781008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:17.781300 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:17.781146 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:17.781300 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:17.781209 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:04:21.781190342 +0000 UTC m=+10.121344694 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:17.881410 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:17.881350 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:17.881606 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:17.881540 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:17.881606 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:17.881565 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:17.881606 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:17.881579 2568 projected.go:194] Error preparing data for projected volume kube-api-access-98jpl for pod openshift-network-diagnostics/network-check-target-x94f6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:17.881748 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:17.881638 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl podName:9cf5bd58-f267-46f8-9af8-24426ecf56e0 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:21.881619552 +0000 UTC m=+10.221773900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-98jpl" (UniqueName: "kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl") pod "network-check-target-x94f6" (UID: "9cf5bd58-f267-46f8-9af8-24426ecf56e0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:18.262745 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:18.262710 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:18.262915 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:18.262859 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:18.263212 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:18.262707 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:18.263382 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:18.263290 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:20.263650 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:20.263615 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:20.264244 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:20.264209 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:20.264576 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:20.264558 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:20.264749 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:20.264725 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:21.816475 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.815833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:21.816475 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.816041 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:21.816475 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.816112 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:04:29.816091523 +0000 UTC m=+18.156245854 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:04:21.818103 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.817834 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hnjc9"] Apr 17 20:04:21.821002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.820970 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:21.821146 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.821054 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc" Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.916433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.916540 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.916602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-kubelet-config\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:21.916627 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-dbus\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.916809 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.916827 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.916839 2568 projected.go:194] Error preparing data for projected volume kube-api-access-98jpl for pod openshift-network-diagnostics/network-check-target-x94f6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:21.917220 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:21.916893 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl podName:9cf5bd58-f267-46f8-9af8-24426ecf56e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:29.916874565 +0000 UTC m=+18.257028897 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-98jpl" (UniqueName: "kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl") pod "network-check-target-x94f6" (UID: "9cf5bd58-f267-46f8-9af8-24426ecf56e0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:04:22.017818 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.017768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-kubelet-config\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:22.018002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.017829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-dbus\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:22.018002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.017888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:22.018002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.017988 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-kubelet-config\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " 
pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:22.018147 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:22.018044 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:22.018147 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:22.018106 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret podName:63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc nodeName:}" failed. No retries permitted until 2026-04-17 20:04:22.518087109 +0000 UTC m=+10.858241454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret") pod "global-pull-secret-syncer-hnjc9" (UID: "63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:22.018249 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.018177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-dbus\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:22.263683 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.263652 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:22.263857 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:22.263756 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:22.263857 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.263808 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:22.264046 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:22.263938 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:22.523179 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:22.523085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:22.523341 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:22.523221 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:22.523341 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:22.523287 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret podName:63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc nodeName:}" failed. No retries permitted until 2026-04-17 20:04:23.523265902 +0000 UTC m=+11.863420233 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret") pod "global-pull-secret-syncer-hnjc9" (UID: "63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:23.263545 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:23.263507 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:23.263987 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:23.263749 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc" Apr 17 20:04:23.530897 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:23.530797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:23.531064 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:23.530945 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:23.531064 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:23.531015 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret podName:63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:25.530995258 +0000 UTC m=+13.871149597 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret") pod "global-pull-secret-syncer-hnjc9" (UID: "63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc") : object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:24.263719 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:24.263621 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:24.263719 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:24.263663 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:24.264176 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:24.263757 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:24.264176 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:24.263896 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:25.263546 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:25.263506 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:25.263741 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:25.263644 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc" Apr 17 20:04:25.546098 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:25.546012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:25.546253 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:25.546157 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 20:04:25.546253 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:25.546227 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret podName:63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc nodeName:}" failed. No retries permitted until 2026-04-17 20:04:29.546208243 +0000 UTC m=+17.886362573 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret") pod "global-pull-secret-syncer-hnjc9" (UID: "63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:04:26.263347 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:26.263296 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:26.263550 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:26.263302 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:26.263550 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:26.263460 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:26.263550 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:26.263527 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:27.262611 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:27.262571 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:27.263041 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:27.262683 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:28.262760 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:28.262719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:28.263204 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:28.262731 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:28.263204 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:28.262887 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:28.263204 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:28.262948 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:29.263314 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:29.263276 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:29.263840 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.263428 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:29.577903 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:29.577797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:29.578072 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.577938 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:04:29.578072 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.578026 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret podName:63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc nodeName:}" failed. No retries permitted until 2026-04-17 20:04:37.578003498 +0000 UTC m=+25.918157830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret") pod "global-pull-secret-syncer-hnjc9" (UID: "63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:04:29.879740 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:29.879647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:29.879914 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.879806 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:04:29.879914 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.879897 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:04:45.879874887 +0000 UTC m=+34.220029216 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 20:04:29.980800 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:29.980754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:29.980971 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.980928 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:04:29.980971 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.980949 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:04:29.980971 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.980959 2568 projected.go:194] Error preparing data for projected volume kube-api-access-98jpl for pod openshift-network-diagnostics/network-check-target-x94f6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:04:29.981087 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:29.981022 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl podName:9cf5bd58-f267-46f8-9af8-24426ecf56e0 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:45.98100012 +0000 UTC m=+34.321154470 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-98jpl" (UniqueName: "kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl") pod "network-check-target-x94f6" (UID: "9cf5bd58-f267-46f8-9af8-24426ecf56e0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:04:30.263601 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:30.263566 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:30.264016 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:30.263678 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:30.264016 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:30.263734 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:30.264016 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:30.263824 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:31.263165 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:31.263125 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:31.263358 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:31.263260 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:32.264214 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.263922 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:32.264689 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:32.264259 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:32.264689 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.264067 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:32.264689 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:32.264342 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:32.384332 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.384296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gkjdr" event={"ID":"7470e9e7-7248-44cf-81a8-fc62c99d05b9","Type":"ContainerStarted","Data":"34985de937d5b2c46e1cf744d592ab0f424695162abe9511cb5f667a2bdc9c08"}
Apr 17 20:04:32.385522 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.385499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-88qdj" event={"ID":"316b73ef-655f-4979-a7b1-dcaf0e3bb3ad","Type":"ContainerStarted","Data":"6a3183af41f9460faca0ae361a38a9f6100d23902e492c9a4b315c70ff64b8e7"}
Apr 17 20:04:32.386655 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.386636 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" event={"ID":"d04fad23-2e01-4696-9db6-11106ea9ec13","Type":"ContainerStarted","Data":"eb4aa38e691c26fa9b829ebd2104cfad4d6fa46b0f4c78dec77f3ef7f2292ad3"}
Apr 17 20:04:32.388082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.388064 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log"
Apr 17 20:04:32.388354 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.388337 2568 generic.go:358] "Generic (PLEG): container finished" podID="32f70982-1fda-48ca-bbf7-530ff3957212" containerID="4b64f90acbba4528f9b80772592fb5748ddc783e43867b5dc17138987802e047" exitCode=1
Apr 17 20:04:32.388428 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.388389 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"b96ddd50bf4813ad0ef8b568afe2271fcfa0ff7da57cfdce3e8f52b07b44b16e"}
Apr 17 20:04:32.388428 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.388424 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerDied","Data":"4b64f90acbba4528f9b80772592fb5748ddc783e43867b5dc17138987802e047"}
Apr 17 20:04:32.388524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.388435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"2cfdc99e4248b86c0589b4e3ae70d6a13a8581d8034c46db41503a965280650e"}
Apr 17 20:04:32.389599 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.389580 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" event={"ID":"eeab5cda-2d6b-4fd2-96fe-b2cb300b80fd","Type":"ContainerStarted","Data":"68044f6a9338e64686a3ccfb738e3ab60d3ba9c4fb948032317ca1de018dd643"}
Apr 17 20:04:32.390781 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.390763 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6bfbp" event={"ID":"a591f534-0100-4238-b0cc-81835de74e25","Type":"ContainerStarted","Data":"362fb4418f4488db6fcc74ba60486454f831d658f96f0282ea282149d3be411b"}
Apr 17 20:04:32.391816 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.391797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rxx56" event={"ID":"648545b8-d5e2-4491-9d4d-e78f3052aefb","Type":"ContainerStarted","Data":"417bede515dffd20e5fa2906c646d76d7556b62998ef914b42f7ef628eb11b90"}
Apr 17 20:04:32.392998 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.392979 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerStarted","Data":"c8b1c1f5b1eb277e349147412cb5faec098a7fafe8ccdbb08cdd34a87dced715"}
Apr 17 20:04:32.397860 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.397826 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gkjdr" podStartSLOduration=3.337991444 podStartE2EDuration="20.397816684s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.907505037 +0000 UTC m=+3.247659374" lastFinishedPulling="2026-04-17 20:04:31.967330268 +0000 UTC m=+20.307484614" observedRunningTime="2026-04-17 20:04:32.39763648 +0000 UTC m=+20.737790842" watchObservedRunningTime="2026-04-17 20:04:32.397816684 +0000 UTC m=+20.737971035"
Apr 17 20:04:32.429911 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.429789 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ztjgz" podStartSLOduration=3.359270176 podStartE2EDuration="20.429773437s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.898325572 +0000 UTC m=+3.238479902" lastFinishedPulling="2026-04-17 20:04:31.968828829 +0000 UTC m=+20.308983163" observedRunningTime="2026-04-17 20:04:32.411688308 +0000 UTC m=+20.751842660" watchObservedRunningTime="2026-04-17 20:04:32.429773437 +0000 UTC m=+20.769927771"
Apr 17 20:04:32.454664 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.454611 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6bfbp" podStartSLOduration=11.396259003 podStartE2EDuration="20.454591238s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.897268952 +0000 UTC m=+3.237423287" lastFinishedPulling="2026-04-17 20:04:23.955601191 +0000 UTC m=+12.295755522" observedRunningTime="2026-04-17 20:04:32.442111104 +0000 UTC m=+20.782265456" watchObservedRunningTime="2026-04-17 20:04:32.454591238 +0000 UTC m=+20.794745590"
Apr 17 20:04:32.455166 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.455139 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-88qdj" podStartSLOduration=3.39357775 podStartE2EDuration="20.455131519s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.905809904 +0000 UTC m=+3.245964236" lastFinishedPulling="2026-04-17 20:04:31.967363665 +0000 UTC m=+20.307518005" observedRunningTime="2026-04-17 20:04:32.454448093 +0000 UTC m=+20.794602447" watchObservedRunningTime="2026-04-17 20:04:32.455131519 +0000 UTC m=+20.795285867"
Apr 17 20:04:32.477471 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:32.477418 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rxx56" podStartSLOduration=3.235703764 podStartE2EDuration="20.477386085s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.896515631 +0000 UTC m=+3.236669967" lastFinishedPulling="2026-04-17 20:04:32.138197946 +0000 UTC m=+20.478352288" observedRunningTime="2026-04-17 20:04:32.476845488 +0000 UTC m=+20.816999840" watchObservedRunningTime="2026-04-17 20:04:32.477386085 +0000 UTC m=+20.817540435"
Apr 17 20:04:33.262923 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.262903 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:33.263020 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:33.263000 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:33.276133 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.276112 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 20:04:33.395600 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.395560 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-9d497" event={"ID":"87a4ec80-629d-4a66-8195-8fe5d60b43f9","Type":"ContainerStarted","Data":"43e6294772303044f866d6ef19400801f582618a755f7b7f016ad56fe29ff5a4"}
Apr 17 20:04:33.397051 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.397024 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" event={"ID":"d04fad23-2e01-4696-9db6-11106ea9ec13","Type":"ContainerStarted","Data":"a5dbe67f0b88771ef2a178d2b33848f2baa5c2fda19ce794db1db8e94590c69f"}
Apr 17 20:04:33.399254 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.399232 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log"
Apr 17 20:04:33.399628 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.399606 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"80dfd41db38378f32c86ce040c74bd78824988570506788632efa07026e7bcf1"}
Apr 17 20:04:33.399713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.399636 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"b5c12e983767fe5ceb35486e90fefa15d2b0975152fad0bce1ca1e78116348ba"}
Apr 17 20:04:33.399713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.399651 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"b1e368aa39edee986528151065a04c2ce1b5b9aa8932e2a763796140240a346b"}
Apr 17 20:04:33.400907 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.400886 2568 generic.go:358] "Generic (PLEG): container finished" podID="8beb86a8-efdf-4bba-8697-baf00c6854af" containerID="c8b1c1f5b1eb277e349147412cb5faec098a7fafe8ccdbb08cdd34a87dced715" exitCode=0
Apr 17 20:04:33.401004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.400983 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerDied","Data":"c8b1c1f5b1eb277e349147412cb5faec098a7fafe8ccdbb08cdd34a87dced715"}
Apr 17 20:04:33.409672 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:33.409634 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-9d497" podStartSLOduration=4.353133239 podStartE2EDuration="21.409620483s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.903010359 +0000 UTC m=+3.243164689" lastFinishedPulling="2026-04-17 20:04:31.959497603 +0000 UTC m=+20.299651933" observedRunningTime="2026-04-17 20:04:33.409482116 +0000 UTC m=+21.749636478" watchObservedRunningTime="2026-04-17 20:04:33.409620483 +0000 UTC m=+21.749774833"
Apr 17 20:04:34.203996 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.203680 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:04:33.276128719Z","UUID":"cc73cb25-098d-4887-a13b-6eba9c342112","Handler":null,"Name":"","Endpoint":""}
Apr 17 20:04:34.212002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.211976 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 20:04:34.212002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.212004 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 20:04:34.263087 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.263050 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:34.263282 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.263050 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:34.263282 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:34.263195 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:34.263417 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:34.263274 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:34.405230 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.405147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" event={"ID":"d04fad23-2e01-4696-9db6-11106ea9ec13","Type":"ContainerStarted","Data":"f99a33a26f1629be47f44ae5ac0e8567029e77634b295950efede65aa4f99d09"}
Apr 17 20:04:34.422313 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:34.422256 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6cdwp" podStartSLOduration=3.246880385 podStartE2EDuration="22.422237775s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.902213407 +0000 UTC m=+3.242367739" lastFinishedPulling="2026-04-17 20:04:34.077570797 +0000 UTC m=+22.417725129" observedRunningTime="2026-04-17 20:04:34.421294591 +0000 UTC m=+22.761448942" watchObservedRunningTime="2026-04-17 20:04:34.422237775 +0000 UTC m=+22.762392127"
Apr 17 20:04:35.262821 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:35.262783 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:35.262983 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:35.262926 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:35.410348 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:35.410321 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log"
Apr 17 20:04:35.410781 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:35.410729 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"3833ac6d5529c213e964176dec4f8aace82596553b914e7367ad23e23da998a8"}
Apr 17 20:04:36.263064 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:36.263029 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:36.263312 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:36.263079 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:36.263312 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:36.263172 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:36.263446 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:36.263309 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:36.596160 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:36.596075 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:36.596910 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:36.596886 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:37.262599 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.262570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:37.262770 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:37.262702 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:37.421951 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.421610 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log"
Apr 17 20:04:37.422312 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.422277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"ae7c4dc30f10325b74df7936d8ffccd81b729eb71c1325f97980db936580e98d"}
Apr 17 20:04:37.422757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.422691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:37.422757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.422723 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:37.422757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.422851 2568 scope.go:117] "RemoveContainer" containerID="4b64f90acbba4528f9b80772592fb5748ddc783e43867b5dc17138987802e047"
Apr 17 20:04:37.424200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.424166 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerStarted","Data":"97d79b4bdbb2c1c0a50bfad333f45e35c7f3178c6d744d450ab436257e2698a8"}
Apr 17 20:04:37.425004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.424980 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:37.425877 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.425844 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6bfbp"
Apr 17 20:04:37.439773 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.439749 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:37.636939 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:37.636900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:37.637653 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:37.637063 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 20:04:37.637653 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:37.637153 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret podName:63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc nodeName:}" failed. No retries permitted until 2026-04-17 20:04:53.637133646 +0000 UTC m=+41.977287974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret") pod "global-pull-secret-syncer-hnjc9" (UID: "63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc") : object "kube-system"/"original-pull-secret" not registered
Apr 17 20:04:38.263298 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.263255 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6"
Apr 17 20:04:38.263298 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.263299 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:38.263536 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:38.263371 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0"
Apr 17 20:04:38.263536 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:38.263479 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:38.427374 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.427340 2568 generic.go:358] "Generic (PLEG): container finished" podID="8beb86a8-efdf-4bba-8697-baf00c6854af" containerID="97d79b4bdbb2c1c0a50bfad333f45e35c7f3178c6d744d450ab436257e2698a8" exitCode=0
Apr 17 20:04:38.427546 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.427459 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerDied","Data":"97d79b4bdbb2c1c0a50bfad333f45e35c7f3178c6d744d450ab436257e2698a8"}
Apr 17 20:04:38.430861 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.430836 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log"
Apr 17 20:04:38.431284 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.431254 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" event={"ID":"32f70982-1fda-48ca-bbf7-530ff3957212","Type":"ContainerStarted","Data":"a0ccf2e933d98ab3cb430563f7dba7be452eee81dee37a47b2718812229551cc"}
Apr 17 20:04:38.431644 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.431585 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:38.452292 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.452253 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld"
Apr 17 20:04:38.476282 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:38.476181 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" podStartSLOduration=9.350024026 podStartE2EDuration="26.476166426s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.90002512 +0000 UTC m=+3.240179449" lastFinishedPulling="2026-04-17 20:04:32.026167514 +0000 UTC m=+20.366321849" observedRunningTime="2026-04-17 20:04:38.475678295 +0000 UTC m=+26.815832667" watchObservedRunningTime="2026-04-17 20:04:38.476166426 +0000 UTC m=+26.816320776"
Apr 17 20:04:39.208132 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.207924 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hnjc9"]
Apr 17 20:04:39.208508 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.208243 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9"
Apr 17 20:04:39.208508 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:39.208336 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc"
Apr 17 20:04:39.210495 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.210459 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2ctfd"]
Apr 17 20:04:39.210648 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.210589 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd"
Apr 17 20:04:39.210708 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:39.210678 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df"
Apr 17 20:04:39.211036 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.211015 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x94f6"]
Apr 17 20:04:39.211136 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.211123 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:39.211242 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:39.211220 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:39.434607 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.434575 2568 generic.go:358] "Generic (PLEG): container finished" podID="8beb86a8-efdf-4bba-8697-baf00c6854af" containerID="91946aae26bd59bd4a9cc31c6c8d0896405bee004e15ba5750dab78f72c523ff" exitCode=0 Apr 17 20:04:39.434777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:39.434659 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerDied","Data":"91946aae26bd59bd4a9cc31c6c8d0896405bee004e15ba5750dab78f72c523ff"} Apr 17 20:04:40.440679 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:40.440644 2568 generic.go:358] "Generic (PLEG): container finished" podID="8beb86a8-efdf-4bba-8697-baf00c6854af" containerID="c54ab443f82ef8986d4288baeda3b1a09363d4fcda3e86e8cac0cc2116447ee8" exitCode=0 Apr 17 20:04:40.441124 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:40.440728 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerDied","Data":"c54ab443f82ef8986d4288baeda3b1a09363d4fcda3e86e8cac0cc2116447ee8"} Apr 17 20:04:41.263670 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:41.263629 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:41.263820 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:41.263771 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:41.263903 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:41.263886 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:41.264010 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:41.263985 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:41.264094 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:41.264066 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:41.264148 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:41.264137 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc" Apr 17 20:04:43.263137 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:43.263108 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:43.263137 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:43.263122 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:43.263844 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:43.263108 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:43.263844 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:43.263229 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hnjc9" podUID="63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc" Apr 17 20:04:43.263844 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:43.263315 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2ctfd" podUID="a3207e4f-83f5-4913-a57e-c29dd6aed2df" Apr 17 20:04:43.263844 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:43.263413 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x94f6" podUID="9cf5bd58-f267-46f8-9af8-24426ecf56e0" Apr 17 20:04:43.962521 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:43.962477 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-159.ec2.internal" event="NodeReady" Apr 17 20:04:43.962707 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:43.962611 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 20:04:44.007077 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.006984 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nrnx5"] Apr 17 20:04:44.038638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.038548 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qpnwl"] Apr 17 20:04:44.038796 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.038696 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.041246 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.041220 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 20:04:44.041246 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.041245 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 20:04:44.041483 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.041389 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4cbzc\"" Apr 17 20:04:44.056919 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.056892 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrnx5"] Apr 17 20:04:44.056919 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.056920 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qpnwl"] Apr 17 20:04:44.057078 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.057044 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.059352 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.059328 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 20:04:44.059525 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.059503 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 20:04:44.059602 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.059508 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tl4j2\"" Apr 17 20:04:44.059786 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.059766 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 20:04:44.186654 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.186615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea635e92-8024-48e9-9b19-6fbeddfe380a-config-volume\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.186838 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.186710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzhss\" (UniqueName: \"kubernetes.io/projected/ea635e92-8024-48e9-9b19-6fbeddfe380a-kube-api-access-zzhss\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.186838 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.186736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfnnx\" (UniqueName: 
\"kubernetes.io/projected/37414adb-2a0d-4af9-93ad-64cc2ea178e7-kube-api-access-rfnnx\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.186838 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.186782 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ea635e92-8024-48e9-9b19-6fbeddfe380a-tmp-dir\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.186838 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.186806 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.186838 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.186830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.287880 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.287785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfnnx\" (UniqueName: \"kubernetes.io/projected/37414adb-2a0d-4af9-93ad-64cc2ea178e7-kube-api-access-rfnnx\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.287880 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.287839 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ea635e92-8024-48e9-9b19-6fbeddfe380a-tmp-dir\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.287880 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.287866 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.287889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.287925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea635e92-8024-48e9-9b19-6fbeddfe380a-config-volume\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.288010 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.288037 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.288078 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert 
podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:44.788058253 +0000 UTC m=+33.128212583 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.288095 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:04:44.788087021 +0000 UTC m=+33.128241353 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.288117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzhss\" (UniqueName: \"kubernetes.io/projected/ea635e92-8024-48e9-9b19-6fbeddfe380a-kube-api-access-zzhss\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.288302 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ea635e92-8024-48e9-9b19-6fbeddfe380a-tmp-dir\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.288630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.288578 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/ea635e92-8024-48e9-9b19-6fbeddfe380a-config-volume\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.299214 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.299180 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzhss\" (UniqueName: \"kubernetes.io/projected/ea635e92-8024-48e9-9b19-6fbeddfe380a-kube-api-access-zzhss\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.299435 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.299412 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfnnx\" (UniqueName: \"kubernetes.io/projected/37414adb-2a0d-4af9-93ad-64cc2ea178e7-kube-api-access-rfnnx\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.792170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.792122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:44.792379 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:44.792188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:44.792379 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.792260 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not 
found Apr 17 20:04:44.792379 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.792341 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:45.792319843 +0000 UTC m=+34.132474195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:04:44.792379 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.792362 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:44.792578 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:44.792496 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:04:45.792478222 +0000 UTC m=+34.132632570 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:04:45.262725 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.262674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:45.262936 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.262673 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:45.263008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.262673 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:45.266672 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.266643 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:04:45.266823 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.266691 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x8jh2\"" Apr 17 20:04:45.266894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.266843 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:04:45.266894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.266865 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:04:45.267002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.266918 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:04:45.267185 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.267168 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8c6pz\"" Apr 17 20:04:45.799228 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.799185 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 
20:04:45.799772 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.799243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:45.799772 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:45.799356 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:45.799772 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:45.799360 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:45.799772 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:45.799455 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:04:47.799439149 +0000 UTC m=+36.139593477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:04:45.799772 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:45.799467 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:47.799461488 +0000 UTC m=+36.139615817 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:04:45.900165 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:45.900112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:04:45.900362 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:45.900282 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:04:45.900459 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:45.900367 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:05:17.900342721 +0000 UTC m=+66.240497053 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : secret "metrics-daemon-secret" not found Apr 17 20:04:46.000915 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:46.000871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:46.004021 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:46.003994 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jpl\" (UniqueName: \"kubernetes.io/projected/9cf5bd58-f267-46f8-9af8-24426ecf56e0-kube-api-access-98jpl\") pod \"network-check-target-x94f6\" (UID: \"9cf5bd58-f267-46f8-9af8-24426ecf56e0\") " pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:46.187942 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:46.187728 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:46.341453 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:46.341421 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x94f6"] Apr 17 20:04:46.379825 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:46.379784 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf5bd58_f267_46f8_9af8_24426ecf56e0.slice/crio-dd1445f347c42f8bcd0b40b58ffa7960e21ca04494302b3ac5e4eb6ba4c7697d WatchSource:0}: Error finding container dd1445f347c42f8bcd0b40b58ffa7960e21ca04494302b3ac5e4eb6ba4c7697d: Status 404 returned error can't find the container with id dd1445f347c42f8bcd0b40b58ffa7960e21ca04494302b3ac5e4eb6ba4c7697d Apr 17 20:04:46.452879 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:46.452765 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x94f6" event={"ID":"9cf5bd58-f267-46f8-9af8-24426ecf56e0","Type":"ContainerStarted","Data":"dd1445f347c42f8bcd0b40b58ffa7960e21ca04494302b3ac5e4eb6ba4c7697d"} Apr 17 20:04:47.458238 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:47.458193 2568 generic.go:358] "Generic (PLEG): container finished" podID="8beb86a8-efdf-4bba-8697-baf00c6854af" containerID="51270d0fb905155169dcaf32e6ddece3fe9a1a25ddd793e54a385aa942ddd52d" exitCode=0 Apr 17 20:04:47.458759 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:47.458277 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerDied","Data":"51270d0fb905155169dcaf32e6ddece3fe9a1a25ddd793e54a385aa942ddd52d"} Apr 17 20:04:47.816878 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:47.816784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:47.816878 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:47.816836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:47.817132 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:47.816948 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:47.817132 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:47.816973 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:47.817132 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:47.817030 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:04:51.817010744 +0000 UTC m=+40.157165076 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:04:47.817132 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:47.817049 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:04:51.817040209 +0000 UTC m=+40.157194543 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:04:48.463318 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:48.463219 2568 generic.go:358] "Generic (PLEG): container finished" podID="8beb86a8-efdf-4bba-8697-baf00c6854af" containerID="150f5783ad94d6fec9b63659c102e9c20772627fa96f1d5fe8ed2b24845ee84b" exitCode=0 Apr 17 20:04:48.463318 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:48.463273 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerDied","Data":"150f5783ad94d6fec9b63659c102e9c20772627fa96f1d5fe8ed2b24845ee84b"} Apr 17 20:04:49.468369 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:49.468126 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-245q6" event={"ID":"8beb86a8-efdf-4bba-8697-baf00c6854af","Type":"ContainerStarted","Data":"c803b92e812c3587801c3426ba358c2194001a5d9f2d3c0bb65368838021d775"} Apr 17 20:04:49.469462 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:49.469434 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x94f6" event={"ID":"9cf5bd58-f267-46f8-9af8-24426ecf56e0","Type":"ContainerStarted","Data":"62e5474e665320edc02c19366683b0e23ddbc20a5494dbbd2e61fcfc79646a21"} Apr 17 20:04:49.469603 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:49.469584 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:04:49.488708 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:49.488662 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-245q6" podStartSLOduration=5.97503564 podStartE2EDuration="37.48864889s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:14.894846596 +0000 UTC m=+3.235000928" lastFinishedPulling="2026-04-17 20:04:46.408459845 +0000 UTC m=+34.748614178" observedRunningTime="2026-04-17 20:04:49.48752304 +0000 UTC m=+37.827677390" watchObservedRunningTime="2026-04-17 20:04:49.48864889 +0000 UTC m=+37.828803241" Apr 17 20:04:49.501115 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:49.500978 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x94f6" podStartSLOduration=34.558583777 podStartE2EDuration="37.500961902s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:04:46.385773256 +0000 UTC m=+34.725927586" lastFinishedPulling="2026-04-17 20:04:49.328151381 +0000 UTC m=+37.668305711" observedRunningTime="2026-04-17 20:04:49.50059259 +0000 UTC m=+37.840746941" watchObservedRunningTime="2026-04-17 20:04:49.500961902 +0000 UTC m=+37.841116254" Apr 17 20:04:51.847632 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:51.847591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:51.847632 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:51.847635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 
20:04:51.848160 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:51.847742 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:51.848160 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:51.847744 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:51.848160 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:51.847803 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:04:59.847787343 +0000 UTC m=+48.187941677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:04:51.848160 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:51.847815 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:04:59.847809257 +0000 UTC m=+48.187963586 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:04:53.661509 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:53.661466 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:53.665526 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:53.665501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc-original-pull-secret\") pod \"global-pull-secret-syncer-hnjc9\" (UID: \"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc\") " pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:53.682724 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:53.682696 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hnjc9" Apr 17 20:04:53.814411 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:53.814360 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hnjc9"] Apr 17 20:04:53.822650 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:04:53.822616 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63c6975a_1b4b_43cc_9f02_5ad1b5b8b8cc.slice/crio-39e0437e4664e66a78da7a733b10765a88046cf5114261f9fd3efdc8fc1a5e5d WatchSource:0}: Error finding container 39e0437e4664e66a78da7a733b10765a88046cf5114261f9fd3efdc8fc1a5e5d: Status 404 returned error can't find the container with id 39e0437e4664e66a78da7a733b10765a88046cf5114261f9fd3efdc8fc1a5e5d Apr 17 20:04:54.480000 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:54.479963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hnjc9" event={"ID":"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc","Type":"ContainerStarted","Data":"39e0437e4664e66a78da7a733b10765a88046cf5114261f9fd3efdc8fc1a5e5d"} Apr 17 20:04:58.489098 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:58.489011 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hnjc9" event={"ID":"63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc","Type":"ContainerStarted","Data":"b69c53bae7b47b09d40217fa010b83f9a7ed40fab0b395e4a76f2439e9c7dc6b"} Apr 17 20:04:58.503376 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:58.503332 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hnjc9" podStartSLOduration=33.131289928 podStartE2EDuration="37.503317282s" podCreationTimestamp="2026-04-17 20:04:21 +0000 UTC" firstStartedPulling="2026-04-17 20:04:53.824446539 +0000 UTC m=+42.164600887" lastFinishedPulling="2026-04-17 20:04:58.196473908 +0000 UTC m=+46.536628241" 
observedRunningTime="2026-04-17 20:04:58.50312782 +0000 UTC m=+46.843282175" watchObservedRunningTime="2026-04-17 20:04:58.503317282 +0000 UTC m=+46.843471634" Apr 17 20:04:59.908237 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:59.908187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:04:59.908237 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:04:59.908237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:04:59.908763 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:59.908327 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:04:59.908763 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:59.908374 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:04:59.908763 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:59.908417 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:15.908377097 +0000 UTC m=+64.248531426 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:04:59.908763 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:04:59.908468 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:05:15.908451433 +0000 UTC m=+64.248605763 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:05:10.452524 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:10.452491 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jqvld" Apr 17 20:05:15.910434 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:15.910375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:05:15.910434 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:15.910434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:05:15.911034 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:15.910560 2568 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:05:15.911034 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:15.910569 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:05:15.911034 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:15.910630 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.910612283 +0000 UTC m=+96.250766627 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found Apr 17 20:05:15.911034 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:15.910644 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.91063825 +0000 UTC m=+96.250792580 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found Apr 17 20:05:17.923381 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:17.923336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:05:17.923809 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:17.923499 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:05:17.923809 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:17.923557 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs podName:a3207e4f-83f5-4913-a57e-c29dd6aed2df nodeName:}" failed. No retries permitted until 2026-04-17 20:06:21.92354292 +0000 UTC m=+130.263697248 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs") pod "network-metrics-daemon-2ctfd" (UID: "a3207e4f-83f5-4913-a57e-c29dd6aed2df") : secret "metrics-daemon-secret" not found Apr 17 20:05:20.473305 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:20.473278 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x94f6" Apr 17 20:05:27.728053 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.728022 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-674s7"] Apr 17 20:05:27.766841 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.766801 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-674s7"] Apr 17 20:05:27.767013 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.766934 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.769411 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.769373 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 20:05:27.770302 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.770278 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:05:27.770440 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.770313 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:05:27.770440 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.770357 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 20:05:27.770440 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.770375 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-58vsf\"" Apr 17 20:05:27.776340 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.776313 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 20:05:27.793978 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.793940 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1980268a-97f5-4f06-8170-7ecf507eddf7-service-ca-bundle\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.794122 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.793986 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1980268a-97f5-4f06-8170-7ecf507eddf7-snapshots\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.794122 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.794004 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1980268a-97f5-4f06-8170-7ecf507eddf7-serving-cert\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.794122 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.794080 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1980268a-97f5-4f06-8170-7ecf507eddf7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.794229 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.794143 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1980268a-97f5-4f06-8170-7ecf507eddf7-tmp\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.794229 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.794177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6sp\" (UniqueName: \"kubernetes.io/projected/1980268a-97f5-4f06-8170-7ecf507eddf7-kube-api-access-fn6sp\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " 
pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.894603 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.894570 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1980268a-97f5-4f06-8170-7ecf507eddf7-tmp\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.894728 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.894611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6sp\" (UniqueName: \"kubernetes.io/projected/1980268a-97f5-4f06-8170-7ecf507eddf7-kube-api-access-fn6sp\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.894728 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.894645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1980268a-97f5-4f06-8170-7ecf507eddf7-service-ca-bundle\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.894728 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.894670 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1980268a-97f5-4f06-8170-7ecf507eddf7-snapshots\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.894728 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.894684 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1980268a-97f5-4f06-8170-7ecf507eddf7-serving-cert\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.894728 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.894726 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1980268a-97f5-4f06-8170-7ecf507eddf7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.895067 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.895042 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1980268a-97f5-4f06-8170-7ecf507eddf7-tmp\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.895315 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.895279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1980268a-97f5-4f06-8170-7ecf507eddf7-service-ca-bundle\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.895448 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.895320 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1980268a-97f5-4f06-8170-7ecf507eddf7-snapshots\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.895590 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.895573 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1980268a-97f5-4f06-8170-7ecf507eddf7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.897102 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.897070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1980268a-97f5-4f06-8170-7ecf507eddf7-serving-cert\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:27.902569 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:27.902543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6sp\" (UniqueName: \"kubernetes.io/projected/1980268a-97f5-4f06-8170-7ecf507eddf7-kube-api-access-fn6sp\") pod \"insights-operator-585dfdc468-674s7\" (UID: \"1980268a-97f5-4f06-8170-7ecf507eddf7\") " pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:28.087809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:28.087713 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-674s7" Apr 17 20:05:28.205749 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:28.205715 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-674s7"] Apr 17 20:05:28.209303 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:05:28.209270 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1980268a_97f5_4f06_8170_7ecf507eddf7.slice/crio-cb262ac2fdbb05801d2d37b6e3cf5abcfcd24cd021e1d57826336022254b8066 WatchSource:0}: Error finding container cb262ac2fdbb05801d2d37b6e3cf5abcfcd24cd021e1d57826336022254b8066: Status 404 returned error can't find the container with id cb262ac2fdbb05801d2d37b6e3cf5abcfcd24cd021e1d57826336022254b8066 Apr 17 20:05:28.548863 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:28.548813 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-674s7" event={"ID":"1980268a-97f5-4f06-8170-7ecf507eddf7","Type":"ContainerStarted","Data":"cb262ac2fdbb05801d2d37b6e3cf5abcfcd24cd021e1d57826336022254b8066"} Apr 17 20:05:30.555068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:30.554942 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-674s7" event={"ID":"1980268a-97f5-4f06-8170-7ecf507eddf7","Type":"ContainerStarted","Data":"c5506394ee94c27ed46a535788feb36de99bf9e818167022a8524006a98c9596"} Apr 17 20:05:30.571195 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:30.571140 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-674s7" podStartSLOduration=1.456030436 podStartE2EDuration="3.571125655s" podCreationTimestamp="2026-04-17 20:05:27 +0000 UTC" firstStartedPulling="2026-04-17 20:05:28.211080337 +0000 UTC m=+76.551234669" lastFinishedPulling="2026-04-17 
20:05:30.326175556 +0000 UTC m=+78.666329888" observedRunningTime="2026-04-17 20:05:30.57091017 +0000 UTC m=+78.911064534" watchObservedRunningTime="2026-04-17 20:05:30.571125655 +0000 UTC m=+78.911280006" Apr 17 20:05:33.922478 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:33.922448 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-88qdj_316b73ef-655f-4979-a7b1-dcaf0e3bb3ad/dns-node-resolver/0.log" Apr 17 20:05:35.123534 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:35.123504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gkjdr_7470e9e7-7248-44cf-81a8-fc62c99d05b9/node-ca/0.log" Apr 17 20:05:36.740177 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.740142 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z"] Apr 17 20:05:36.744211 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.744192 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:36.746630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.746604 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:05:36.746758 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.746627 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 20:05:36.747483 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.747471 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 20:05:36.747569 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.747550 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-cbfk9\"" Apr 17 20:05:36.750446 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.750424 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z"] Apr 17 20:05:36.861875 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.861832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4x4m\" (UniqueName: \"kubernetes.io/projected/d884fe32-5577-4c53-8e66-8d523b9000c9-kube-api-access-k4x4m\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:36.861875 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.861878 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:36.962521 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.962470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4x4m\" (UniqueName: \"kubernetes.io/projected/d884fe32-5577-4c53-8e66-8d523b9000c9-kube-api-access-k4x4m\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:36.962694 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.962531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:36.962694 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:36.962657 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:05:36.962767 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:36.962731 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls podName:d884fe32-5577-4c53-8e66-8d523b9000c9 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:37.462711914 +0000 UTC m=+85.802866247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lgk8z" (UID: "d884fe32-5577-4c53-8e66-8d523b9000c9") : secret "samples-operator-tls" not found Apr 17 20:05:36.970645 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:36.970623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4x4m\" (UniqueName: \"kubernetes.io/projected/d884fe32-5577-4c53-8e66-8d523b9000c9-kube-api-access-k4x4m\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:37.465993 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.465948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:37.466166 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:37.466105 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:05:37.466207 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:37.466189 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls podName:d884fe32-5577-4c53-8e66-8d523b9000c9 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:38.466170199 +0000 UTC m=+86.806324529 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lgk8z" (UID: "d884fe32-5577-4c53-8e66-8d523b9000c9") : secret "samples-operator-tls" not found Apr 17 20:05:37.798749 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.798658 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bg9xh"] Apr 17 20:05:37.801619 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.801598 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx"] Apr 17 20:05:37.801769 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.801749 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.803870 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.803839 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 20:05:37.804008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.803980 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 20:05:37.804095 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.804083 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jbmq9\"" Apr 17 20:05:37.804656 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.804634 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 20:05:37.805050 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.805037 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:05:37.805592 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.805575 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.808083 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.808066 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 20:05:37.808194 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.808176 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-xg96f\"" Apr 17 20:05:37.808449 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.808429 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:05:37.808746 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.808672 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 20:05:37.808746 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.808684 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 20:05:37.809930 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.809912 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 20:05:37.811813 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.811769 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx"] Apr 17 20:05:37.812814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.812794 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-bg9xh"] Apr 17 20:05:37.869485 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869448 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c003df9-b811-4f77-9d0a-01312bf9421d-config\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.869485 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869486 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-config\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.869692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869512 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hf8\" (UniqueName: \"kubernetes.io/projected/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-kube-api-access-56hf8\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.869692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869605 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c003df9-b811-4f77-9d0a-01312bf9421d-serving-cert\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.869692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869652 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c003df9-b811-4f77-9d0a-01312bf9421d-trusted-ca\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.869692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869676 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzkp\" (UniqueName: \"kubernetes.io/projected/5c003df9-b811-4f77-9d0a-01312bf9421d-kube-api-access-ngzkp\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.869804 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.869732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.970929 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.970880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56hf8\" (UniqueName: \"kubernetes.io/projected/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-kube-api-access-56hf8\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.971130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.970951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c003df9-b811-4f77-9d0a-01312bf9421d-serving-cert\") pod 
\"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.971130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.970984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c003df9-b811-4f77-9d0a-01312bf9421d-trusted-ca\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.971130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971043 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzkp\" (UniqueName: \"kubernetes.io/projected/5c003df9-b811-4f77-9d0a-01312bf9421d-kube-api-access-ngzkp\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.971130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.971319 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c003df9-b811-4f77-9d0a-01312bf9421d-config\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.971319 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971179 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-config\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.971819 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-config\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.971914 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971802 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c003df9-b811-4f77-9d0a-01312bf9421d-config\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.972009 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.971993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c003df9-b811-4f77-9d0a-01312bf9421d-trusted-ca\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.973344 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.973328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c003df9-b811-4f77-9d0a-01312bf9421d-serving-cert\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:37.973416 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.973358 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-serving-cert\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.979147 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.979117 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hf8\" (UniqueName: \"kubernetes.io/projected/fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba-kube-api-access-56hf8\") pod \"service-ca-operator-d6fc45fc5-jbbnx\" (UID: \"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:37.979383 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:37.979366 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzkp\" (UniqueName: \"kubernetes.io/projected/5c003df9-b811-4f77-9d0a-01312bf9421d-kube-api-access-ngzkp\") pod \"console-operator-9d4b6777b-bg9xh\" (UID: \"5c003df9-b811-4f77-9d0a-01312bf9421d\") " pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:38.115149 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.115054 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:38.118825 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.118802 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" Apr 17 20:05:38.246553 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.246515 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bg9xh"] Apr 17 20:05:38.249876 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:05:38.249845 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c003df9_b811_4f77_9d0a_01312bf9421d.slice/crio-b5434a8e2c0ce2d765fa0b0808b766434125959f347e2e45e5a4e0d9dce27da0 WatchSource:0}: Error finding container b5434a8e2c0ce2d765fa0b0808b766434125959f347e2e45e5a4e0d9dce27da0: Status 404 returned error can't find the container with id b5434a8e2c0ce2d765fa0b0808b766434125959f347e2e45e5a4e0d9dce27da0 Apr 17 20:05:38.258384 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.258361 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx"] Apr 17 20:05:38.262013 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:05:38.261989 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe56c17d_bf66_46c0_a9c6_5baf8eb3ccba.slice/crio-f6c4dc56af3461ec646fa97ac7229b3635f38a5007fc4830b1b8951909b970f3 WatchSource:0}: Error finding container f6c4dc56af3461ec646fa97ac7229b3635f38a5007fc4830b1b8951909b970f3: Status 404 returned error can't find the container with id f6c4dc56af3461ec646fa97ac7229b3635f38a5007fc4830b1b8951909b970f3 Apr 17 20:05:38.475781 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.475742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: 
\"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:38.475964 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:38.475892 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:05:38.475964 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:38.475959 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls podName:d884fe32-5577-4c53-8e66-8d523b9000c9 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:40.475943127 +0000 UTC m=+88.816097456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lgk8z" (UID: "d884fe32-5577-4c53-8e66-8d523b9000c9") : secret "samples-operator-tls" not found Apr 17 20:05:38.570908 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.570872 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" event={"ID":"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba","Type":"ContainerStarted","Data":"f6c4dc56af3461ec646fa97ac7229b3635f38a5007fc4830b1b8951909b970f3"} Apr 17 20:05:38.572049 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:38.572017 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" event={"ID":"5c003df9-b811-4f77-9d0a-01312bf9421d","Type":"ContainerStarted","Data":"b5434a8e2c0ce2d765fa0b0808b766434125959f347e2e45e5a4e0d9dce27da0"} Apr 17 20:05:40.489822 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:40.489781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:40.490318 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:40.489897 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:05:40.490318 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:40.489992 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls podName:d884fe32-5577-4c53-8e66-8d523b9000c9 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:44.489966496 +0000 UTC m=+92.830120834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lgk8z" (UID: "d884fe32-5577-4c53-8e66-8d523b9000c9") : secret "samples-operator-tls" not found Apr 17 20:05:40.578465 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:40.578423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" event={"ID":"5c003df9-b811-4f77-9d0a-01312bf9421d","Type":"ContainerStarted","Data":"03f4b1195a3b17313292854a915ad9e0447af4cbbb1d993f6f2955b4af5218b6"} Apr 17 20:05:40.578717 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:40.578694 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" Apr 17 20:05:40.580156 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:40.580122 2568 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-bg9xh container/console-operator namespace/openshift-console-operator: Readiness probe 
status=failure output="Get \"https://10.133.0.10:8443/readyz\": dial tcp 10.133.0.10:8443: connect: connection refused" start-of-body= Apr 17 20:05:40.580281 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:40.580177 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" podUID="5c003df9-b811-4f77-9d0a-01312bf9421d" containerName="console-operator" probeResult="failure" output="Get \"https://10.133.0.10:8443/readyz\": dial tcp 10.133.0.10:8443: connect: connection refused" Apr 17 20:05:40.594367 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:40.594315 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" podStartSLOduration=1.711299698 podStartE2EDuration="3.594292804s" podCreationTimestamp="2026-04-17 20:05:37 +0000 UTC" firstStartedPulling="2026-04-17 20:05:38.251575855 +0000 UTC m=+86.591730184" lastFinishedPulling="2026-04-17 20:05:40.134568957 +0000 UTC m=+88.474723290" observedRunningTime="2026-04-17 20:05:40.593925112 +0000 UTC m=+88.934079462" watchObservedRunningTime="2026-04-17 20:05:40.594292804 +0000 UTC m=+88.934447156" Apr 17 20:05:41.582732 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:41.582695 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" event={"ID":"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba","Type":"ContainerStarted","Data":"9a9cba4527fcb75cb92aebb3106e8cfd7f89a43a17bb1150007ff80a4afa99a0"} Apr 17 20:05:41.584101 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:41.584081 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/0.log" Apr 17 20:05:41.584191 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:41.584117 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="5c003df9-b811-4f77-9d0a-01312bf9421d" containerID="03f4b1195a3b17313292854a915ad9e0447af4cbbb1d993f6f2955b4af5218b6" exitCode=255 Apr 17 20:05:41.584191 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:41.584147 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" event={"ID":"5c003df9-b811-4f77-9d0a-01312bf9421d","Type":"ContainerDied","Data":"03f4b1195a3b17313292854a915ad9e0447af4cbbb1d993f6f2955b4af5218b6"} Apr 17 20:05:41.584371 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:41.584356 2568 scope.go:117] "RemoveContainer" containerID="03f4b1195a3b17313292854a915ad9e0447af4cbbb1d993f6f2955b4af5218b6" Apr 17 20:05:41.601820 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:41.598901 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" podStartSLOduration=2.162466192 podStartE2EDuration="4.598881618s" podCreationTimestamp="2026-04-17 20:05:37 +0000 UTC" firstStartedPulling="2026-04-17 20:05:38.26405514 +0000 UTC m=+86.604209469" lastFinishedPulling="2026-04-17 20:05:40.700470567 +0000 UTC m=+89.040624895" observedRunningTime="2026-04-17 20:05:41.597218374 +0000 UTC m=+89.937372740" watchObservedRunningTime="2026-04-17 20:05:41.598881618 +0000 UTC m=+89.939035970" Apr 17 20:05:42.588326 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:42.588297 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:05:42.588757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:42.588724 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/0.log" Apr 17 20:05:42.588801 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:42.588764 2568 generic.go:358] "Generic 
(PLEG): container finished" podID="5c003df9-b811-4f77-9d0a-01312bf9421d" containerID="31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072" exitCode=255 Apr 17 20:05:42.588879 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:42.588853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" event={"ID":"5c003df9-b811-4f77-9d0a-01312bf9421d","Type":"ContainerDied","Data":"31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072"} Apr 17 20:05:42.588958 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:42.588897 2568 scope.go:117] "RemoveContainer" containerID="03f4b1195a3b17313292854a915ad9e0447af4cbbb1d993f6f2955b4af5218b6" Apr 17 20:05:42.589110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:42.589094 2568 scope.go:117] "RemoveContainer" containerID="31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072" Apr 17 20:05:42.589305 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:42.589287 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bg9xh_openshift-console-operator(5c003df9-b811-4f77-9d0a-01312bf9421d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" podUID="5c003df9-b811-4f77-9d0a-01312bf9421d" Apr 17 20:05:43.594827 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:43.594798 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:05:43.595301 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:43.595224 2568 scope.go:117] "RemoveContainer" containerID="31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072" Apr 17 20:05:43.595463 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:43.595440 2568 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bg9xh_openshift-console-operator(5c003df9-b811-4f77-9d0a-01312bf9421d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" podUID="5c003df9-b811-4f77-9d0a-01312bf9421d" Apr 17 20:05:44.243495 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.243462 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq"] Apr 17 20:05:44.247659 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.247631 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" Apr 17 20:05:44.249952 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.249931 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-bvdfd\"" Apr 17 20:05:44.253358 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.253330 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq"] Apr 17 20:05:44.321947 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.321904 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnzt\" (UniqueName: \"kubernetes.io/projected/f96a9e59-b624-4850-ab85-b3968aa4f8b3-kube-api-access-qrnzt\") pod \"network-check-source-8894fc9bd-q6svq\" (UID: \"f96a9e59-b624-4850-ab85-b3968aa4f8b3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" Apr 17 20:05:44.422378 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.422345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnzt\" (UniqueName: 
\"kubernetes.io/projected/f96a9e59-b624-4850-ab85-b3968aa4f8b3-kube-api-access-qrnzt\") pod \"network-check-source-8894fc9bd-q6svq\" (UID: \"f96a9e59-b624-4850-ab85-b3968aa4f8b3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" Apr 17 20:05:44.430727 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.430692 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnzt\" (UniqueName: \"kubernetes.io/projected/f96a9e59-b624-4850-ab85-b3968aa4f8b3-kube-api-access-qrnzt\") pod \"network-check-source-8894fc9bd-q6svq\" (UID: \"f96a9e59-b624-4850-ab85-b3968aa4f8b3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" Apr 17 20:05:44.523750 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.523651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" Apr 17 20:05:44.523889 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:44.523769 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:05:44.523889 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:44.523848 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls podName:d884fe32-5577-4c53-8e66-8d523b9000c9 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:52.523831651 +0000 UTC m=+100.863985981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lgk8z" (UID: "d884fe32-5577-4c53-8e66-8d523b9000c9") : secret "samples-operator-tls" not found
Apr 17 20:05:44.558349 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.558313 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq"
Apr 17 20:05:44.672573 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:44.672528 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq"]
Apr 17 20:05:44.677064 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:05:44.677024 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf96a9e59_b624_4850_ab85_b3968aa4f8b3.slice/crio-546d9fc8282367a8a9af3870a2fcff9a20b502471e006eb8c86631df57844583 WatchSource:0}: Error finding container 546d9fc8282367a8a9af3870a2fcff9a20b502471e006eb8c86631df57844583: Status 404 returned error can't find the container with id 546d9fc8282367a8a9af3870a2fcff9a20b502471e006eb8c86631df57844583
Apr 17 20:05:45.199900 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.199863 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sw2k4"]
Apr 17 20:05:45.202874 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.202856 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.205257 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.205234 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 20:05:45.205257 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.205249 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-nq9vm\""
Apr 17 20:05:45.205467 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.205234 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 20:05:45.206097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.206084 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 20:05:45.206162 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.206133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 20:05:45.209924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.209899 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sw2k4"]
Apr 17 20:05:45.330458 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.330419 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnfz\" (UniqueName: \"kubernetes.io/projected/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-kube-api-access-zgnfz\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.330659 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.330578 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-signing-cabundle\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.330659 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.330621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-signing-key\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.431539 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.431493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-signing-cabundle\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.431735 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.431548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-signing-key\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.431735 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.431697 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnfz\" (UniqueName: \"kubernetes.io/projected/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-kube-api-access-zgnfz\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.432324 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.432302 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-signing-cabundle\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.434163 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.434130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-signing-key\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.439225 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.439197 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnfz\" (UniqueName: \"kubernetes.io/projected/e01d66e7-7c1f-49cf-a82e-bbd687f019e0-kube-api-access-zgnfz\") pod \"service-ca-865cb79987-sw2k4\" (UID: \"e01d66e7-7c1f-49cf-a82e-bbd687f019e0\") " pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.512330 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.512229 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sw2k4"
Apr 17 20:05:45.557741 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.557697 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bqprb"]
Apr 17 20:05:45.564196 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.564164 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.567164 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.566879 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 20:05:45.567164 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.566977 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zcfv7\""
Apr 17 20:05:45.567164 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.566877 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 20:05:45.569487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.569442 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bqprb"]
Apr 17 20:05:45.603205 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.603164 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" event={"ID":"f96a9e59-b624-4850-ab85-b3968aa4f8b3","Type":"ContainerStarted","Data":"0f279fe886769d243cd002d93cf45589ff09e02ebe8eb4e9bc9711ca182b10e1"}
Apr 17 20:05:45.603205 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.603210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" event={"ID":"f96a9e59-b624-4850-ab85-b3968aa4f8b3","Type":"ContainerStarted","Data":"546d9fc8282367a8a9af3870a2fcff9a20b502471e006eb8c86631df57844583"}
Apr 17 20:05:45.633140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.633106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4436becd-7f00-417c-82b8-a06a4171ec21-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.633311 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.633155 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bz2\" (UniqueName: \"kubernetes.io/projected/4436becd-7f00-417c-82b8-a06a4171ec21-kube-api-access-42bz2\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.633311 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.633189 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4436becd-7f00-417c-82b8-a06a4171ec21-data-volume\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.633311 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.633261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4436becd-7f00-417c-82b8-a06a4171ec21-crio-socket\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.633311 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.633300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.637832 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.637780 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-q6svq" podStartSLOduration=1.637760774 podStartE2EDuration="1.637760774s" podCreationTimestamp="2026-04-17 20:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:05:45.617832304 +0000 UTC m=+93.957986649" watchObservedRunningTime="2026-04-17 20:05:45.637760774 +0000 UTC m=+93.977915126"
Apr 17 20:05:45.638253 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.638235 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sw2k4"]
Apr 17 20:05:45.641406 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:05:45.641357 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode01d66e7_7c1f_49cf_a82e_bbd687f019e0.slice/crio-da7590c4addb9e8d2c6d022e91573a3319172d6c01808054cf7db7808b981df5 WatchSource:0}: Error finding container da7590c4addb9e8d2c6d022e91573a3319172d6c01808054cf7db7808b981df5: Status 404 returned error can't find the container with id da7590c4addb9e8d2c6d022e91573a3319172d6c01808054cf7db7808b981df5
Apr 17 20:05:45.734837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.734473 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4436becd-7f00-417c-82b8-a06a4171ec21-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.734837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.734577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42bz2\" (UniqueName: \"kubernetes.io/projected/4436becd-7f00-417c-82b8-a06a4171ec21-kube-api-access-42bz2\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.734837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.734620 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4436becd-7f00-417c-82b8-a06a4171ec21-data-volume\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.734837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.734674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4436becd-7f00-417c-82b8-a06a4171ec21-crio-socket\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.734837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.734728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.734837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.734770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4436becd-7f00-417c-82b8-a06a4171ec21-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.738363 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.736049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4436becd-7f00-417c-82b8-a06a4171ec21-crio-socket\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.738363 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.736474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4436becd-7f00-417c-82b8-a06a4171ec21-data-volume\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:45.738363 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:45.736503 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:45.738363 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:45.736571 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls podName:4436becd-7f00-417c-82b8-a06a4171ec21 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:46.23655011 +0000 UTC m=+94.576704442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bqprb" (UID: "4436becd-7f00-417c-82b8-a06a4171ec21") : secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:45.747481 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:45.747454 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bz2\" (UniqueName: \"kubernetes.io/projected/4436becd-7f00-417c-82b8-a06a4171ec21-kube-api-access-42bz2\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:46.240298 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:46.240260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:46.240501 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:46.240431 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:46.240501 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:46.240496 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls podName:4436becd-7f00-417c-82b8-a06a4171ec21 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:47.24047984 +0000 UTC m=+95.580634169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bqprb" (UID: "4436becd-7f00-417c-82b8-a06a4171ec21") : secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:46.607388 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:46.607300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sw2k4" event={"ID":"e01d66e7-7c1f-49cf-a82e-bbd687f019e0","Type":"ContainerStarted","Data":"fbe5542812354f1a75e139ff77a2974e744d26ac5f60518e32ed3375bd1f6074"}
Apr 17 20:05:46.607388 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:46.607339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sw2k4" event={"ID":"e01d66e7-7c1f-49cf-a82e-bbd687f019e0","Type":"ContainerStarted","Data":"da7590c4addb9e8d2c6d022e91573a3319172d6c01808054cf7db7808b981df5"}
Apr 17 20:05:46.624860 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:46.624808 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-sw2k4" podStartSLOduration=1.62479343 podStartE2EDuration="1.62479343s" podCreationTimestamp="2026-04-17 20:05:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:05:46.623819843 +0000 UTC m=+94.963974193" watchObservedRunningTime="2026-04-17 20:05:46.62479343 +0000 UTC m=+94.964947781"
Apr 17 20:05:47.249561 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:47.249520 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:47.250017 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:47.249692 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:47.250133 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:47.250083 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls podName:4436becd-7f00-417c-82b8-a06a4171ec21 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:49.250056775 +0000 UTC m=+97.590211104 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bqprb" (UID: "4436becd-7f00-417c-82b8-a06a4171ec21") : secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:47.955455 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:47.955413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl"
Apr 17 20:05:47.955670 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:47.955537 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5"
Apr 17 20:05:47.955670 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:47.955579 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 20:05:47.955670 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:47.955657 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 20:05:47.955820 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:47.955675 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert podName:37414adb-2a0d-4af9-93ad-64cc2ea178e7 nodeName:}" failed. No retries permitted until 2026-04-17 20:06:51.955654013 +0000 UTC m=+160.295808357 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert") pod "ingress-canary-qpnwl" (UID: "37414adb-2a0d-4af9-93ad-64cc2ea178e7") : secret "canary-serving-cert" not found
Apr 17 20:05:47.955820 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:47.955707 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls podName:ea635e92-8024-48e9-9b19-6fbeddfe380a nodeName:}" failed. No retries permitted until 2026-04-17 20:06:51.955691167 +0000 UTC m=+160.295845502 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls") pod "dns-default-nrnx5" (UID: "ea635e92-8024-48e9-9b19-6fbeddfe380a") : secret "dns-default-metrics-tls" not found
Apr 17 20:05:48.115777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:48.115736 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh"
Apr 17 20:05:48.116164 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:48.116149 2568 scope.go:117] "RemoveContainer" containerID="31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072"
Apr 17 20:05:48.116329 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:48.116312 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bg9xh_openshift-console-operator(5c003df9-b811-4f77-9d0a-01312bf9421d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" podUID="5c003df9-b811-4f77-9d0a-01312bf9421d"
Apr 17 20:05:49.266485 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:49.266443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:49.266871 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:49.266608 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:49.266871 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:49.266690 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls podName:4436becd-7f00-417c-82b8-a06a4171ec21 nodeName:}" failed. No retries permitted until 2026-04-17 20:05:53.266674024 +0000 UTC m=+101.606828352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls") pod "insights-runtime-extractor-bqprb" (UID: "4436becd-7f00-417c-82b8-a06a4171ec21") : secret "insights-runtime-extractor-tls" not found
Apr 17 20:05:50.579488 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:50.579448 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh"
Apr 17 20:05:50.579870 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:50.579848 2568 scope.go:117] "RemoveContainer" containerID="31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072"
Apr 17 20:05:50.580041 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:05:50.580023 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bg9xh_openshift-console-operator(5c003df9-b811-4f77-9d0a-01312bf9421d)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" podUID="5c003df9-b811-4f77-9d0a-01312bf9421d"
Apr 17 20:05:52.590635 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:52.590594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z"
Apr 17 20:05:52.592983 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:52.592956 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d884fe32-5577-4c53-8e66-8d523b9000c9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lgk8z\" (UID: \"d884fe32-5577-4c53-8e66-8d523b9000c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z"
Apr 17 20:05:52.653825 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:52.653789 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z"
Apr 17 20:05:52.768945 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:52.768913 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z"]
Apr 17 20:05:53.296253 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.296211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:53.298632 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.298597 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4436becd-7f00-417c-82b8-a06a4171ec21-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqprb\" (UID: \"4436becd-7f00-417c-82b8-a06a4171ec21\") " pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:53.379140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.379096 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bqprb"
Apr 17 20:05:53.499618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.499585 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bqprb"]
Apr 17 20:05:53.503872 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:05:53.503844 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4436becd_7f00_417c_82b8_a06a4171ec21.slice/crio-c37856418fd908d8b550f7441647797deec181839ece00050c990cb6954bb4f7 WatchSource:0}: Error finding container c37856418fd908d8b550f7441647797deec181839ece00050c990cb6954bb4f7: Status 404 returned error can't find the container with id c37856418fd908d8b550f7441647797deec181839ece00050c990cb6954bb4f7
Apr 17 20:05:53.625384 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.625345 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" event={"ID":"d884fe32-5577-4c53-8e66-8d523b9000c9","Type":"ContainerStarted","Data":"f1e2da32c96adcf01078290c168b55370c996d20f5e84d305a0c3eb8d311366a"}
Apr 17 20:05:53.626732 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.626709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqprb" event={"ID":"4436becd-7f00-417c-82b8-a06a4171ec21","Type":"ContainerStarted","Data":"e5cdfec403d3d3325162851b177875ddf49277fa26edd858cbcc11e7a784add8"}
Apr 17 20:05:53.626809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:53.626740 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqprb" event={"ID":"4436becd-7f00-417c-82b8-a06a4171ec21","Type":"ContainerStarted","Data":"c37856418fd908d8b550f7441647797deec181839ece00050c990cb6954bb4f7"}
Apr 17 20:05:54.630834 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:54.630804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqprb" event={"ID":"4436becd-7f00-417c-82b8-a06a4171ec21","Type":"ContainerStarted","Data":"a1b607613cd757ba70d0a21bb2aafc8cb2668f8445700901004235fafae2ca0d"}
Apr 17 20:05:56.637556 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:56.637511 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" event={"ID":"d884fe32-5577-4c53-8e66-8d523b9000c9","Type":"ContainerStarted","Data":"3155b93ac1b24c426ceecd9e31f1849e458595f387e4129f020c724bb595e67f"}
Apr 17 20:05:56.637556 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:56.637554 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" event={"ID":"d884fe32-5577-4c53-8e66-8d523b9000c9","Type":"ContainerStarted","Data":"673f0cc848fe993d7e959f5fe44dc77a95da0867a1fb888b6d3cf73faeb29298"}
Apr 17 20:05:56.639275 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:56.639249 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqprb" event={"ID":"4436becd-7f00-417c-82b8-a06a4171ec21","Type":"ContainerStarted","Data":"f289d3822e9101391ead2ef8e4516cc6c2e20af972c11f84439cb0ce19c0cab5"}
Apr 17 20:05:56.652872 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:56.652818 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lgk8z" podStartSLOduration=17.651985186 podStartE2EDuration="20.652804123s" podCreationTimestamp="2026-04-17 20:05:36 +0000 UTC" firstStartedPulling="2026-04-17 20:05:52.809875796 +0000 UTC m=+101.150030126" lastFinishedPulling="2026-04-17 20:05:55.810694734 +0000 UTC m=+104.150849063" observedRunningTime="2026-04-17 20:05:56.6513607 +0000 UTC m=+104.991515051" watchObservedRunningTime="2026-04-17 20:05:56.652804123 +0000 UTC m=+104.992958473"
Apr 17 20:05:56.666543 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:05:56.666476 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bqprb" podStartSLOduration=9.417511262 podStartE2EDuration="11.666461217s" podCreationTimestamp="2026-04-17 20:05:45 +0000 UTC" firstStartedPulling="2026-04-17 20:05:53.563086128 +0000 UTC m=+101.903240458" lastFinishedPulling="2026-04-17 20:05:55.812036081 +0000 UTC m=+104.152190413" observedRunningTime="2026-04-17 20:05:56.665828321 +0000 UTC m=+105.005982676" watchObservedRunningTime="2026-04-17 20:05:56.666461217 +0000 UTC m=+105.006615568"
Apr 17 20:06:02.264756 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:02.264719 2568 scope.go:117] "RemoveContainer" containerID="31482ba963cefb31ee430a24d81bf6a85d71cba83f83cad3c4770bc227f0b072"
Apr 17 20:06:02.658972 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:02.658894 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log"
Apr 17 20:06:02.658972 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:02.658956 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh" event={"ID":"5c003df9-b811-4f77-9d0a-01312bf9421d","Type":"ContainerStarted","Data":"1aa1f94911ab791425ccb9be1fc72dd06ffe9fd02e84e592bd35d1f71f58e9d7"}
Apr 17 20:06:02.659244 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:02.659227 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh"
Apr 17 20:06:02.955866 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:02.955834 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-bg9xh"
Apr 17 20:06:03.815351 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.815314 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g"]
Apr 17 20:06:03.819906 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.819867 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g"
Apr 17 20:06:03.822214 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.822190 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 20:06:03.822336 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.822244 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 20:06:03.823172 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.823155 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-nqtnp\""
Apr 17 20:06:03.827495 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.827474 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g"]
Apr 17 20:06:03.843632 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.843604 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8555ccc55b-vmfxx"]
Apr 17 20:06:03.846636 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.846613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx"
Apr 17 20:06:03.849115 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.849078 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 20:06:03.849247 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.849143 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rkflr\""
Apr 17 20:06:03.849247 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.849160 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 20:06:03.849247 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.849144 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 20:06:03.855152 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.855133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 20:06:03.858578 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.858542 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8555ccc55b-vmfxx"]
Apr 17 20:06:03.982463 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982424 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-registry-tls\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx"
Apr 17 20:06:03.982463 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"kube-api-access-8nzjh\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-kube-api-access-8nzjh\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:03.982680 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50f4a009-e41c-46c1-b6f2-bce51899dc5c-ca-trust-extracted\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:03.982680 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50f4a009-e41c-46c1-b6f2-bce51899dc5c-registry-certificates\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:03.982680 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982622 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50f4a009-e41c-46c1-b6f2-bce51899dc5c-installation-pull-secrets\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:03.982680 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982671 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/863fd5e8-913c-4fd5-925c-77821846ba00-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-fvm6g\" (UID: \"863fd5e8-913c-4fd5-925c-77821846ba00\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:03.982820 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/863fd5e8-913c-4fd5-925c-77821846ba00-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fvm6g\" (UID: \"863fd5e8-913c-4fd5-925c-77821846ba00\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:03.982820 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982776 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-bound-sa-token\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:03.982820 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982805 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50f4a009-e41c-46c1-b6f2-bce51899dc5c-trusted-ca\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:03.982915 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:03.982839 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50f4a009-e41c-46c1-b6f2-bce51899dc5c-image-registry-private-configuration\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " 
pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083595 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-bound-sa-token\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083595 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083556 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50f4a009-e41c-46c1-b6f2-bce51899dc5c-trusted-ca\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50f4a009-e41c-46c1-b6f2-bce51899dc5c-image-registry-private-configuration\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-registry-tls\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nzjh\" 
(UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-kube-api-access-8nzjh\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50f4a009-e41c-46c1-b6f2-bce51899dc5c-ca-trust-extracted\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50f4a009-e41c-46c1-b6f2-bce51899dc5c-registry-certificates\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50f4a009-e41c-46c1-b6f2-bce51899dc5c-installation-pull-secrets\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.083814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/863fd5e8-913c-4fd5-925c-77821846ba00-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fvm6g\" (UID: \"863fd5e8-913c-4fd5-925c-77821846ba00\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:04.084145 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.083946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/863fd5e8-913c-4fd5-925c-77821846ba00-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fvm6g\" (UID: \"863fd5e8-913c-4fd5-925c-77821846ba00\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:04.084433 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.084334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/50f4a009-e41c-46c1-b6f2-bce51899dc5c-ca-trust-extracted\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.084630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.084606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/863fd5e8-913c-4fd5-925c-77821846ba00-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fvm6g\" (UID: \"863fd5e8-913c-4fd5-925c-77821846ba00\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:04.084755 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.084724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/50f4a009-e41c-46c1-b6f2-bce51899dc5c-registry-certificates\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.084843 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.084739 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50f4a009-e41c-46c1-b6f2-bce51899dc5c-trusted-ca\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.086775 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.086744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/50f4a009-e41c-46c1-b6f2-bce51899dc5c-installation-pull-secrets\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.086865 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.086797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/50f4a009-e41c-46c1-b6f2-bce51899dc5c-image-registry-private-configuration\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.086906 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.086882 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/863fd5e8-913c-4fd5-925c-77821846ba00-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fvm6g\" (UID: \"863fd5e8-913c-4fd5-925c-77821846ba00\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:04.086968 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.086952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-registry-tls\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " 
pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.092278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.092249 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-bound-sa-token\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.092614 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.092591 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzjh\" (UniqueName: \"kubernetes.io/projected/50f4a009-e41c-46c1-b6f2-bce51899dc5c-kube-api-access-8nzjh\") pod \"image-registry-8555ccc55b-vmfxx\" (UID: \"50f4a009-e41c-46c1-b6f2-bce51899dc5c\") " pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.129243 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.129205 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" Apr 17 20:06:04.155611 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.155575 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.281278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.281229 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g"] Apr 17 20:06:04.285170 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:04.285135 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod863fd5e8_913c_4fd5_925c_77821846ba00.slice/crio-b45fd19f8a78be0a20d7352305efd74fa4e25ad916f1e0c19e7ebff996231c7a WatchSource:0}: Error finding container b45fd19f8a78be0a20d7352305efd74fa4e25ad916f1e0c19e7ebff996231c7a: Status 404 returned error can't find the container with id b45fd19f8a78be0a20d7352305efd74fa4e25ad916f1e0c19e7ebff996231c7a Apr 17 20:06:04.302015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.301987 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8555ccc55b-vmfxx"] Apr 17 20:06:04.305063 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:04.305037 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50f4a009_e41c_46c1_b6f2_bce51899dc5c.slice/crio-cc1582ca7dd1ab8e13188abb06409fe61c225a06d0fadffbaf825c8106316c6d WatchSource:0}: Error finding container cc1582ca7dd1ab8e13188abb06409fe61c225a06d0fadffbaf825c8106316c6d: Status 404 returned error can't find the container with id cc1582ca7dd1ab8e13188abb06409fe61c225a06d0fadffbaf825c8106316c6d Apr 17 20:06:04.666244 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.666149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" event={"ID":"50f4a009-e41c-46c1-b6f2-bce51899dc5c","Type":"ContainerStarted","Data":"4b193b403edb142bc686780d4c528f512e8857a98d5240fc107ad15bcee07d2d"} Apr 17 20:06:04.666244 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.666190 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" event={"ID":"50f4a009-e41c-46c1-b6f2-bce51899dc5c","Type":"ContainerStarted","Data":"cc1582ca7dd1ab8e13188abb06409fe61c225a06d0fadffbaf825c8106316c6d"} Apr 17 20:06:04.666487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.666266 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:04.667295 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.667274 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" event={"ID":"863fd5e8-913c-4fd5-925c-77821846ba00","Type":"ContainerStarted","Data":"b45fd19f8a78be0a20d7352305efd74fa4e25ad916f1e0c19e7ebff996231c7a"} Apr 17 20:06:04.684853 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:04.684780 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" podStartSLOduration=1.684758416 podStartE2EDuration="1.684758416s" podCreationTimestamp="2026-04-17 20:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:06:04.683370498 +0000 UTC m=+113.023524875" watchObservedRunningTime="2026-04-17 20:06:04.684758416 +0000 UTC m=+113.024912768" Apr 17 20:06:05.671236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:05.671136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" event={"ID":"863fd5e8-913c-4fd5-925c-77821846ba00","Type":"ContainerStarted","Data":"a7f1a891e09bab926d7fea4e4ecc913839f41f12d92f490817162c6931fb16de"} Apr 17 20:06:05.687338 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:05.687283 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fvm6g" podStartSLOduration=1.748945201 podStartE2EDuration="2.687267577s" podCreationTimestamp="2026-04-17 20:06:03 +0000 UTC" firstStartedPulling="2026-04-17 20:06:04.287057518 +0000 UTC m=+112.627211847" lastFinishedPulling="2026-04-17 20:06:05.225379895 +0000 UTC m=+113.565534223" observedRunningTime="2026-04-17 20:06:05.685753779 +0000 UTC m=+114.025908130" watchObservedRunningTime="2026-04-17 20:06:05.687267577 +0000 UTC m=+114.027421925" Apr 17 20:06:11.978545 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.978501 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"] Apr 17 20:06:11.983539 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.983510 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"] Apr 17 20:06:11.983678 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.983657 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" Apr 17 20:06:11.986187 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.986160 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 20:06:11.986332 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.986199 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:06:11.986332 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.986242 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 20:06:11.986332 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.986200 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:06:11.986519 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.986436 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ts2wx\"" Apr 17 20:06:11.986576 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.986539 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 20:06:11.987258 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.987236 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" Apr 17 20:06:11.989617 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.989597 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 20:06:11.991111 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.989962 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 20:06:11.991111 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.990387 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 20:06:11.991111 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:11.990639 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-678vp\"" Apr 17 20:06:12.001165 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.001143 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"] Apr 17 20:06:12.002939 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.002889 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"] Apr 17 20:06:12.005560 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.005537 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-w24f4"] Apr 17 20:06:12.009865 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.009840 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-w24f4" Apr 17 20:06:12.012460 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.012437 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 20:06:12.012578 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.012536 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 20:06:12.012578 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.012568 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ntvkb\"" Apr 17 20:06:12.012865 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.012849 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 20:06:12.096892 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.096858 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d69cfc6df-4lvmh"] Apr 17 20:06:12.100683 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.100654 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d69cfc6df-4lvmh" Apr 17 20:06:12.103454 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.103418 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 20:06:12.103454 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.103444 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 20:06:12.103454 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.103455 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 20:06:12.103714 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.103625 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 20:06:12.103714 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.103702 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 20:06:12.104110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.104088 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 20:06:12.104230 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.104220 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-bmfnl\"" Apr 17 20:06:12.104308 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.104292 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 20:06:12.111004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.110981 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d69cfc6df-4lvmh"] Apr 17 20:06:12.154698 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:06:12.154666 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-tls\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4" Apr 17 20:06:12.154698 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" Apr 17 20:06:12.154943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-accelerators-collector-config\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4" Apr 17 20:06:12.154943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e756761d-4549-4f85-949c-f03278c10be7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" Apr 17 20:06:12.154943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154803 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-textfile\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.154943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154830 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.154943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.154943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154931 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.154975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e756761d-4549-4f85-949c-f03278c10be7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155021 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-wtmp\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8020047-5203-41aa-b91c-7a729e686edb-metrics-client-ca\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ctl9\" (UniqueName: \"kubernetes.io/projected/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-api-access-8ctl9\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ww4\" (UniqueName: \"kubernetes.io/projected/d8020047-5203-41aa-b91c-7a729e686edb-kube-api-access-f8ww4\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.155236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155217 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e756761d-4549-4f85-949c-f03278c10be7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.155647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155245 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-sys\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.155647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27hp2\" (UniqueName: \"kubernetes.io/projected/e756761d-4549-4f85-949c-f03278c10be7-kube-api-access-27hp2\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.155647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155343 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-root\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.155647 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.155376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.256676 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256579 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-root\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.256676 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-config\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.256676 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256664 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-oauth-serving-cert\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-root\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-serving-cert\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-oauth-config\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256815 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-tls\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256904 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-accelerators-collector-config\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.256957 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256936 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e756761d-4549-4f85-949c-f03278c10be7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.256971 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-textfile\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257093 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cfkw\" (UniqueName: \"kubernetes.io/projected/2bcc2590-ce6d-46f5-9bb5-44573329abe4-kube-api-access-2cfkw\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257181 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e756761d-4549-4f85-949c-f03278c10be7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-wtmp\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257235 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8020047-5203-41aa-b91c-7a729e686edb-metrics-client-ca\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257346 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257308 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ctl9\" (UniqueName: \"kubernetes.io/projected/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-api-access-8ctl9\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257429 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ww4\" (UniqueName: \"kubernetes.io/projected/d8020047-5203-41aa-b91c-7a729e686edb-kube-api-access-f8ww4\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257458 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e756761d-4549-4f85-949c-f03278c10be7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257498 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-sys\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-wtmp\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27hp2\" (UniqueName: \"kubernetes.io/projected/e756761d-4549-4f85-949c-f03278c10be7-kube-api-access-27hp2\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257562 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-service-ca\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.257776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8020047-5203-41aa-b91c-7a729e686edb-sys\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.258043 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.257910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-textfile\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.259626 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.259604 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 20:06:12.259757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.259608 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 20:06:12.259938 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.259922 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 20:06:12.259986 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.259922 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 20:06:12.260236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.260220 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 20:06:12.260288 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.260265 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 20:06:12.260288 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.260278 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 20:06:12.260382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.260288 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 20:06:12.260382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.260266 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 20:06:12.265085 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.265064 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 20:06:12.267923 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.267893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-accelerators-collector-config\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.268107 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.268079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.268257 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.268091 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.268341 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.268321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e756761d-4549-4f85-949c-f03278c10be7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.268618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.268590 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8020047-5203-41aa-b91c-7a729e686edb-metrics-client-ca\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.269878 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.269851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e756761d-4549-4f85-949c-f03278c10be7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.270036 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.270012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.270148 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.270127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.270307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.270284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.270374 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.270329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8020047-5203-41aa-b91c-7a729e686edb-node-exporter-tls\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.270784 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.270766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e756761d-4549-4f85-949c-f03278c10be7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.275674 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.275655 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 20:06:12.286943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.286913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ww4\" (UniqueName: \"kubernetes.io/projected/d8020047-5203-41aa-b91c-7a729e686edb-kube-api-access-f8ww4\") pod \"node-exporter-w24f4\" (UID: \"d8020047-5203-41aa-b91c-7a729e686edb\") " pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.287063 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.286923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27hp2\" (UniqueName: \"kubernetes.io/projected/e756761d-4549-4f85-949c-f03278c10be7-kube-api-access-27hp2\") pod \"openshift-state-metrics-9d44df66c-5pkvm\" (UID: \"e756761d-4549-4f85-949c-f03278c10be7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.287063 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.286923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ctl9\" (UniqueName: \"kubernetes.io/projected/d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7-kube-api-access-8ctl9\") pod \"kube-state-metrics-69db897b98-2fz8b\" (UID: \"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.305664 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.305630 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ts2wx\""
Apr 17 20:06:12.312575 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.312553 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-678vp\""
Apr 17 20:06:12.313545 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.313526 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"
Apr 17 20:06:12.321390 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.321364 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"
Apr 17 20:06:12.323864 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.323842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ntvkb\""
Apr 17 20:06:12.331845 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.331812 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w24f4"
Apr 17 20:06:12.341272 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:12.341237 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8020047_5203_41aa_b91c_7a729e686edb.slice/crio-638887e3bc6bd391406c06bcccf4c5b31b015d42e09a228b581bb4c80894ba96 WatchSource:0}: Error finding container 638887e3bc6bd391406c06bcccf4c5b31b015d42e09a228b581bb4c80894ba96: Status 404 returned error can't find the container with id 638887e3bc6bd391406c06bcccf4c5b31b015d42e09a228b581bb4c80894ba96
Apr 17 20:06:12.358097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.358048 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-service-ca\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.358259 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.358117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-config\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.358259 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.358142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-oauth-serving-cert\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.358259 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.358176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-serving-cert\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.358259 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.358205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-oauth-config\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.358410 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.358262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cfkw\" (UniqueName: \"kubernetes.io/projected/2bcc2590-ce6d-46f5-9bb5-44573329abe4-kube-api-access-2cfkw\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.360874 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.360827 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 20:06:12.360874 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.360867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-oauth-config\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.361082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.360884 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 20:06:12.361082 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.360972 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 20:06:12.361222 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.361206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-serving-cert\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.366028 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.365992 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 20:06:12.369885 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.369823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-service-ca\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.370480 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.370425 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-oauth-serving-cert\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.370597 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.370505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-config\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.378826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.378785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cfkw\" (UniqueName: \"kubernetes.io/projected/2bcc2590-ce6d-46f5-9bb5-44573329abe4-kube-api-access-2cfkw\") pod \"console-7d69cfc6df-4lvmh\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") " pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.414047 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.414014 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:12.463370 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.462898 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm"]
Apr 17 20:06:12.465235 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:12.465183 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode756761d_4549_4f85_949c_f03278c10be7.slice/crio-a67396efb0a0dbbf6347d92a9c6179a761605b6dea9b90892ad95293d2b0e8db WatchSource:0}: Error finding container a67396efb0a0dbbf6347d92a9c6179a761605b6dea9b90892ad95293d2b0e8db: Status 404 returned error can't find the container with id a67396efb0a0dbbf6347d92a9c6179a761605b6dea9b90892ad95293d2b0e8db
Apr 17 20:06:12.486178 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.486149 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2fz8b"]
Apr 17 20:06:12.489590 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:12.489541 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a312e9_fdd6_4c8b_a92a_26bb4d05e5e7.slice/crio-37f46bfac598450a972a7a7c05ecf918136d953bfba7626e3a7ef4783ebcaa5b WatchSource:0}: Error finding container 37f46bfac598450a972a7a7c05ecf918136d953bfba7626e3a7ef4783ebcaa5b: Status 404 returned error can't find the container with id 37f46bfac598450a972a7a7c05ecf918136d953bfba7626e3a7ef4783ebcaa5b
Apr 17 20:06:12.561480 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.561449 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d69cfc6df-4lvmh"]
Apr 17 20:06:12.567553 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:12.567522 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcc2590_ce6d_46f5_9bb5_44573329abe4.slice/crio-2b88ea7643d0a20ab8272c8914ef441e22264e94a13f13070c4599b7040a18ce WatchSource:0}: Error finding container 2b88ea7643d0a20ab8272c8914ef441e22264e94a13f13070c4599b7040a18ce: Status 404 returned error can't find the container with id 2b88ea7643d0a20ab8272c8914ef441e22264e94a13f13070c4599b7040a18ce
Apr 17 20:06:12.691723 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.691681 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d69cfc6df-4lvmh" event={"ID":"2bcc2590-ce6d-46f5-9bb5-44573329abe4","Type":"ContainerStarted","Data":"2b88ea7643d0a20ab8272c8914ef441e22264e94a13f13070c4599b7040a18ce"}
Apr 17 20:06:12.692706 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.692677 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w24f4" event={"ID":"d8020047-5203-41aa-b91c-7a729e686edb","Type":"ContainerStarted","Data":"638887e3bc6bd391406c06bcccf4c5b31b015d42e09a228b581bb4c80894ba96"}
Apr 17 20:06:12.693742 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.693713 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" event={"ID":"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7","Type":"ContainerStarted","Data":"37f46bfac598450a972a7a7c05ecf918136d953bfba7626e3a7ef4783ebcaa5b"}
Apr 17 20:06:12.695297 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.695274 2568 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" event={"ID":"e756761d-4549-4f85-949c-f03278c10be7","Type":"ContainerStarted","Data":"e27fe6b0d392643bc344e8a0670da9b1f7fa66465e2875bd6850126859a771fa"} Apr 17 20:06:12.695390 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.695301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" event={"ID":"e756761d-4549-4f85-949c-f03278c10be7","Type":"ContainerStarted","Data":"fb99aa0e11800caf59522ee9f97afcd6fd7381f781b490958add0ed04fb0f004"} Apr 17 20:06:12.695390 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:12.695310 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" event={"ID":"e756761d-4549-4f85-949c-f03278c10be7","Type":"ContainerStarted","Data":"a67396efb0a0dbbf6347d92a9c6179a761605b6dea9b90892ad95293d2b0e8db"} Apr 17 20:06:13.700980 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.700939 2568 generic.go:358] "Generic (PLEG): container finished" podID="d8020047-5203-41aa-b91c-7a729e686edb" containerID="8468a0bb4d2d2097703b803bd1ae1f9732e601b535d39ceb2ade6b432ef903e0" exitCode=0 Apr 17 20:06:13.701453 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.701041 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w24f4" event={"ID":"d8020047-5203-41aa-b91c-7a729e686edb","Type":"ContainerDied","Data":"8468a0bb4d2d2097703b803bd1ae1f9732e601b535d39ceb2ade6b432ef903e0"} Apr 17 20:06:13.971477 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.971371 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-66d48c7756-dzkf6"] Apr 17 20:06:13.975764 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.975736 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:13.978356 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978332 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 20:06:13.978518 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978369 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dsp48hnmnf5b3\"" Apr 17 20:06:13.978518 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978416 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 20:06:13.978518 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978444 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 20:06:13.978518 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978334 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 20:06:13.978791 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978759 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 20:06:13.978850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.978798 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-wlkqz\"" Apr 17 20:06:13.985349 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:13.985312 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66d48c7756-dzkf6"] Apr 17 20:06:14.073175 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073066 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073175 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073107 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073304 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073304 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-tls\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073482 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-grpc-tls\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073482 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073379 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lv9\" (UniqueName: \"kubernetes.io/projected/f7e56ce5-d56a-4759-ac37-5ab3784db08c-kube-api-access-l4lv9\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073584 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.073584 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.073560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e56ce5-d56a-4759-ac37-5ab3784db08c-metrics-client-ca\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.173948 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.173925 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.173972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-tls\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.174016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-grpc-tls\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.174057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lv9\" (UniqueName: \"kubernetes.io/projected/f7e56ce5-d56a-4759-ac37-5ab3784db08c-kube-api-access-l4lv9\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174219 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.174192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " 
pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174268 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.174252 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e56ce5-d56a-4759-ac37-5ab3784db08c-metrics-client-ca\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174316 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.174290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.174368 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.174323 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.175756 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.175698 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e56ce5-d56a-4759-ac37-5ab3784db08c-metrics-client-ca\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.177643 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.177363 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-grpc-tls\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.177643 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.177580 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.177872 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.177775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.178031 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.178005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.178440 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.178390 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.180135 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.179300 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f7e56ce5-d56a-4759-ac37-5ab3784db08c-secret-thanos-querier-tls\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.186616 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.185738 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lv9\" (UniqueName: \"kubernetes.io/projected/f7e56ce5-d56a-4759-ac37-5ab3784db08c-kube-api-access-l4lv9\") pod \"thanos-querier-66d48c7756-dzkf6\" (UID: \"f7e56ce5-d56a-4759-ac37-5ab3784db08c\") " pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.288389 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.288190 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:14.461468 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.461436 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66d48c7756-dzkf6"] Apr 17 20:06:14.465956 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:14.465912 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e56ce5_d56a_4759_ac37_5ab3784db08c.slice/crio-ad05fe6a04094956b039e49665793b55df18da3e9b832ec9a4095c7c34ae257c WatchSource:0}: Error finding container ad05fe6a04094956b039e49665793b55df18da3e9b832ec9a4095c7c34ae257c: Status 404 returned error can't find the container with id ad05fe6a04094956b039e49665793b55df18da3e9b832ec9a4095c7c34ae257c Apr 17 20:06:14.706979 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.706934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w24f4" event={"ID":"d8020047-5203-41aa-b91c-7a729e686edb","Type":"ContainerStarted","Data":"d9f7f247e58d3b62bcaf8d66814c81b0a81e2bb582162de2a590e4e34ab60fec"} Apr 17 20:06:14.707452 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.706987 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w24f4" event={"ID":"d8020047-5203-41aa-b91c-7a729e686edb","Type":"ContainerStarted","Data":"9548199b7b2e29ddbee69f495496130f2d04056f5e8b14f7d68ebcbd44cf1090"} Apr 17 20:06:14.708197 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.708170 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"ad05fe6a04094956b039e49665793b55df18da3e9b832ec9a4095c7c34ae257c"} Apr 17 20:06:14.710110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.710067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" event={"ID":"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7","Type":"ContainerStarted","Data":"67dd74700f449b70f8f7eb27a536f6a0a0c825ea02c0d28be0b7754e85f306d4"} Apr 17 20:06:14.710110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.710102 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" event={"ID":"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7","Type":"ContainerStarted","Data":"a82210af9e8bed4642c242b5023aa54d35409b5ed29d1afe753a60b03daa6f8d"} Apr 17 20:06:14.710272 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.710117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" event={"ID":"d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7","Type":"ContainerStarted","Data":"354abdaf458528d0550b3cea40dd5bf9589860f9303c7c0f4a282fe1b06e3a80"} Apr 17 20:06:14.712194 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.712150 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" event={"ID":"e756761d-4549-4f85-949c-f03278c10be7","Type":"ContainerStarted","Data":"d8cc97ff67a907bd8d901738a0f0122b6da670281242fd98199b6df468b51e78"} Apr 17 20:06:14.725302 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.725244 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-w24f4" podStartSLOduration=2.948745595 podStartE2EDuration="3.725224673s" podCreationTimestamp="2026-04-17 20:06:11 +0000 UTC" firstStartedPulling="2026-04-17 20:06:12.343300692 +0000 UTC m=+120.683455024" lastFinishedPulling="2026-04-17 20:06:13.11977977 +0000 UTC m=+121.459934102" observedRunningTime="2026-04-17 20:06:14.723077573 +0000 UTC m=+123.063231924" watchObservedRunningTime="2026-04-17 20:06:14.725224673 +0000 UTC m=+123.065379026" Apr 17 20:06:14.740191 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.740123 
2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-5pkvm" podStartSLOduration=2.282849099 podStartE2EDuration="3.740107502s" podCreationTimestamp="2026-04-17 20:06:11 +0000 UTC" firstStartedPulling="2026-04-17 20:06:12.620953044 +0000 UTC m=+120.961107373" lastFinishedPulling="2026-04-17 20:06:14.078211443 +0000 UTC m=+122.418365776" observedRunningTime="2026-04-17 20:06:14.73872506 +0000 UTC m=+123.078879413" watchObservedRunningTime="2026-04-17 20:06:14.740107502 +0000 UTC m=+123.080261853" Apr 17 20:06:14.758132 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:14.758070 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-2fz8b" podStartSLOduration=2.174156721 podStartE2EDuration="3.758052668s" podCreationTimestamp="2026-04-17 20:06:11 +0000 UTC" firstStartedPulling="2026-04-17 20:06:12.492313251 +0000 UTC m=+120.832467580" lastFinishedPulling="2026-04-17 20:06:14.076209196 +0000 UTC m=+122.416363527" observedRunningTime="2026-04-17 20:06:14.756610522 +0000 UTC m=+123.096764893" watchObservedRunningTime="2026-04-17 20:06:14.758052668 +0000 UTC m=+123.098207019" Apr 17 20:06:16.721890 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:16.721836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d69cfc6df-4lvmh" event={"ID":"2bcc2590-ce6d-46f5-9bb5-44573329abe4","Type":"ContainerStarted","Data":"fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702"} Apr 17 20:06:16.741891 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:16.741826 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d69cfc6df-4lvmh" podStartSLOduration=1.44595962 podStartE2EDuration="4.741804057s" podCreationTimestamp="2026-04-17 20:06:12 +0000 UTC" firstStartedPulling="2026-04-17 20:06:12.570107861 +0000 UTC m=+120.910262195" 
lastFinishedPulling="2026-04-17 20:06:15.865952289 +0000 UTC m=+124.206106632" observedRunningTime="2026-04-17 20:06:16.74112675 +0000 UTC m=+125.081281116" watchObservedRunningTime="2026-04-17 20:06:16.741804057 +0000 UTC m=+125.081958411" Apr 17 20:06:17.216997 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.216959 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-76b4cfb785-v8q64"] Apr 17 20:06:17.221006 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.220982 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.223539 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.223510 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 20:06:17.223672 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.223607 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 20:06:17.224073 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.224047 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 20:06:17.224288 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.224273 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-bzkxp\"" Apr 17 20:06:17.224356 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.224306 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 20:06:17.225045 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.225025 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 20:06:17.232449 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.232418 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 20:06:17.234105 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.234074 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-76b4cfb785-v8q64"] Apr 17 20:06:17.309276 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309238 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-serving-certs-ca-bundle\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309502 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309502 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309435 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309502 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309466 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-metrics-client-ca\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309664 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-telemeter-client-tls\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309722 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309662 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-federate-client-tls\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309722 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309691 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6p7k\" (UniqueName: \"kubernetes.io/projected/2fdeaaec-d967-4813-a344-1271b1f604b6-kube-api-access-j6p7k\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.309819 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.309733 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-secret-telemeter-client\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411018 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.410963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411018 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-metrics-client-ca\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-telemeter-client-tls\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-federate-client-tls\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " 
pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6p7k\" (UniqueName: \"kubernetes.io/projected/2fdeaaec-d967-4813-a344-1271b1f604b6-kube-api-access-j6p7k\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-secret-telemeter-client\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411227 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-serving-certs-ca-bundle\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411543 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.411954 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.411926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-metrics-client-ca\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.412144 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.412090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-serving-certs-ca-bundle\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.412254 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.412156 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fdeaaec-d967-4813-a344-1271b1f604b6-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.414411 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.414363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-secret-telemeter-client\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.414505 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.414425 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-telemeter-client-tls\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " 
pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.414546 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.414510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.414891 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.414863 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/2fdeaaec-d967-4813-a344-1271b1f604b6-federate-client-tls\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.419596 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.419569 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6p7k\" (UniqueName: \"kubernetes.io/projected/2fdeaaec-d967-4813-a344-1271b1f604b6-kube-api-access-j6p7k\") pod \"telemeter-client-76b4cfb785-v8q64\" (UID: \"2fdeaaec-d967-4813-a344-1271b1f604b6\") " pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.535086 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.534993 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" Apr 17 20:06:17.731705 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:17.731682 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-76b4cfb785-v8q64"] Apr 17 20:06:17.733525 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:17.733499 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fdeaaec_d967_4813_a344_1271b1f604b6.slice/crio-aa5c082055f60c083464fd84a300c3ced8ca52fa89beb23bd7ee04bcd8420aeb WatchSource:0}: Error finding container aa5c082055f60c083464fd84a300c3ced8ca52fa89beb23bd7ee04bcd8420aeb: Status 404 returned error can't find the container with id aa5c082055f60c083464fd84a300c3ced8ca52fa89beb23bd7ee04bcd8420aeb Apr 17 20:06:18.729756 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:18.729720 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" event={"ID":"2fdeaaec-d967-4813-a344-1271b1f604b6","Type":"ContainerStarted","Data":"aa5c082055f60c083464fd84a300c3ced8ca52fa89beb23bd7ee04bcd8420aeb"} Apr 17 20:06:18.731587 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:18.731559 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"112fee6b2cf234ff42b064d7ee623f2b419d1899d60e6e90684bf7c9f71f9a67"} Apr 17 20:06:18.731722 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:18.731593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"2dee190a6c1a16e4705d4a48f9e7c88e82efd06ed3ca717dadfdf41f5c439fc5"} Apr 17 20:06:18.731722 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:18.731605 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"0eb367eb856d59b3206bce4e3802ccf078860f39f56d6bb7cef29b4a635fbdb4"} Apr 17 20:06:19.737718 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:19.737682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"e202b430cb89ae6ffda3b2f3207108c82b41c6cb85d46d6e673f78b9dc14fbeb"} Apr 17 20:06:19.737718 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:19.737724 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"d3a8486d06d99e9fe42d38969c7a672f56438942b9632bd2c57f1fdcf84dbe54"} Apr 17 20:06:19.738203 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:19.737739 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" event={"ID":"f7e56ce5-d56a-4759-ac37-5ab3784db08c","Type":"ContainerStarted","Data":"34528a2cae71bb7e27a019b8809e1eac4d3daf376a48406325783bd3eac20b79"} Apr 17 20:06:19.738203 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:19.737922 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:19.739068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:19.739046 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" event={"ID":"2fdeaaec-d967-4813-a344-1271b1f604b6","Type":"ContainerStarted","Data":"8c9054ecaf4b177feaf499619285f888caa0e871587f49ee2bea7344def0902e"} Apr 17 20:06:19.760198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:19.760093 2568 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" podStartSLOduration=2.43937409 podStartE2EDuration="6.760078865s" podCreationTimestamp="2026-04-17 20:06:13 +0000 UTC" firstStartedPulling="2026-04-17 20:06:14.468576556 +0000 UTC m=+122.808730891" lastFinishedPulling="2026-04-17 20:06:18.789281328 +0000 UTC m=+127.129435666" observedRunningTime="2026-04-17 20:06:19.757779842 +0000 UTC m=+128.097934192" watchObservedRunningTime="2026-04-17 20:06:19.760078865 +0000 UTC m=+128.100233216" Apr 17 20:06:20.743752 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:20.743645 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" event={"ID":"2fdeaaec-d967-4813-a344-1271b1f604b6","Type":"ContainerStarted","Data":"b07e17eeb757f7c7a6c978ad8ab1f9a4ca08ab43c550919b8452c1aaa5208c0a"} Apr 17 20:06:20.743752 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:20.743694 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" event={"ID":"2fdeaaec-d967-4813-a344-1271b1f604b6","Type":"ContainerStarted","Data":"9be955a89c033e1e97dd48f0f37b5b9d00f1e4907c6b8057a8cb554aab5f08b3"} Apr 17 20:06:20.765181 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:20.765128 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-76b4cfb785-v8q64" podStartSLOduration=1.032393737 podStartE2EDuration="3.765110854s" podCreationTimestamp="2026-04-17 20:06:17 +0000 UTC" firstStartedPulling="2026-04-17 20:06:17.735772947 +0000 UTC m=+126.075927294" lastFinishedPulling="2026-04-17 20:06:20.468490074 +0000 UTC m=+128.808644411" observedRunningTime="2026-04-17 20:06:20.762864831 +0000 UTC m=+129.103019183" watchObservedRunningTime="2026-04-17 20:06:20.765110854 +0000 UTC m=+129.105265204" Apr 17 20:06:21.954785 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:21.954745 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:06:21.957068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:21.957044 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3207e4f-83f5-4913-a57e-c29dd6aed2df-metrics-certs\") pod \"network-metrics-daemon-2ctfd\" (UID: \"a3207e4f-83f5-4913-a57e-c29dd6aed2df\") " pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:06:22.177015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.176984 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x8jh2\"" Apr 17 20:06:22.185038 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.185002 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ctfd" Apr 17 20:06:22.318058 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.317994 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2ctfd"] Apr 17 20:06:22.320288 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:22.320239 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3207e4f_83f5_4913_a57e_c29dd6aed2df.slice/crio-6dc885ad2a44e9f2e8326d47b9b81d1c46224c60327c6ce0c3ba6cbac0782f7e WatchSource:0}: Error finding container 6dc885ad2a44e9f2e8326d47b9b81d1c46224c60327c6ce0c3ba6cbac0782f7e: Status 404 returned error can't find the container with id 6dc885ad2a44e9f2e8326d47b9b81d1c46224c60327c6ce0c3ba6cbac0782f7e Apr 17 20:06:22.414873 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.414842 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d69cfc6df-4lvmh" Apr 17 20:06:22.415049 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.414887 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d69cfc6df-4lvmh" Apr 17 20:06:22.419512 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.419485 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d69cfc6df-4lvmh" Apr 17 20:06:22.437953 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.437919 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6746c58bfc-m7bhw"] Apr 17 20:06:22.442331 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.442308 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.450455 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.450170 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 20:06:22.452092 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.452066 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6746c58bfc-m7bhw"] Apr 17 20:06:22.559818 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.559732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-serving-cert\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.559986 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.559869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-console-config\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.559986 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.559924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlcxb\" (UniqueName: \"kubernetes.io/projected/35cb8783-2a10-4a90-b516-500f71ad6775-kube-api-access-jlcxb\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.559986 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.559959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-oauth-serving-cert\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.560108 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.560059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-oauth-config\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.560108 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.560088 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-trusted-ca-bundle\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.560181 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.560113 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-service-ca\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.660568 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.660531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-oauth-config\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.660568 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:06:22.660567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-trusted-ca-bundle\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.660809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.660588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-service-ca\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.660809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.660651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-serving-cert\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.660809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.660756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-console-config\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.660958 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.660810 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlcxb\" (UniqueName: \"kubernetes.io/projected/35cb8783-2a10-4a90-b516-500f71ad6775-kube-api-access-jlcxb\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 
17 20:06:22.660958 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.660842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-oauth-serving-cert\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.661361 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.661334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-service-ca\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.661519 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.661494 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-trusted-ca-bundle\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.661584 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.661531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-console-config\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.661584 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.661549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-oauth-serving-cert\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " 
pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.663769 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.663712 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-oauth-config\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.664360 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.664341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-serving-cert\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.669984 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.669953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlcxb\" (UniqueName: \"kubernetes.io/projected/35cb8783-2a10-4a90-b516-500f71ad6775-kube-api-access-jlcxb\") pod \"console-6746c58bfc-m7bhw\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.750531 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.750496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ctfd" event={"ID":"a3207e4f-83f5-4913-a57e-c29dd6aed2df","Type":"ContainerStarted","Data":"6dc885ad2a44e9f2e8326d47b9b81d1c46224c60327c6ce0c3ba6cbac0782f7e"} Apr 17 20:06:22.753277 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.753241 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:22.754603 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.754570 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d69cfc6df-4lvmh" Apr 17 20:06:22.891996 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:22.891964 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6746c58bfc-m7bhw"] Apr 17 20:06:22.894507 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:22.894477 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35cb8783_2a10_4a90_b516_500f71ad6775.slice/crio-c8f924092a36c4c018671e24475f3aab54c897bd74d07524ed46a906b4cd8f5e WatchSource:0}: Error finding container c8f924092a36c4c018671e24475f3aab54c897bd74d07524ed46a906b4cd8f5e: Status 404 returned error can't find the container with id c8f924092a36c4c018671e24475f3aab54c897bd74d07524ed46a906b4cd8f5e Apr 17 20:06:23.756494 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:23.756194 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6746c58bfc-m7bhw" event={"ID":"35cb8783-2a10-4a90-b516-500f71ad6775","Type":"ContainerStarted","Data":"5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f"} Apr 17 20:06:23.756494 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:23.756243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6746c58bfc-m7bhw" event={"ID":"35cb8783-2a10-4a90-b516-500f71ad6775","Type":"ContainerStarted","Data":"c8f924092a36c4c018671e24475f3aab54c897bd74d07524ed46a906b4cd8f5e"} Apr 17 20:06:23.758095 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:23.758060 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ctfd" 
event={"ID":"a3207e4f-83f5-4913-a57e-c29dd6aed2df","Type":"ContainerStarted","Data":"e59077b9b4c6f97ce5168d146eec934b48dfd90323cecfa2c1c6530c591af169"} Apr 17 20:06:23.774255 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:23.774143 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6746c58bfc-m7bhw" podStartSLOduration=1.7741218600000002 podStartE2EDuration="1.77412186s" podCreationTimestamp="2026-04-17 20:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:06:23.773507846 +0000 UTC m=+132.113662197" watchObservedRunningTime="2026-04-17 20:06:23.77412186 +0000 UTC m=+132.114276213" Apr 17 20:06:24.763135 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:24.763092 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ctfd" event={"ID":"a3207e4f-83f5-4913-a57e-c29dd6aed2df","Type":"ContainerStarted","Data":"c661271467f18b66ac1afdb25fb8eb80251fd0979bdbed280fbbd0fb15876a5a"} Apr 17 20:06:24.778053 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:24.777996 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2ctfd" podStartSLOduration=131.53150058 podStartE2EDuration="2m12.777981178s" podCreationTimestamp="2026-04-17 20:04:12 +0000 UTC" firstStartedPulling="2026-04-17 20:06:22.322669903 +0000 UTC m=+130.662824246" lastFinishedPulling="2026-04-17 20:06:23.569150512 +0000 UTC m=+131.909304844" observedRunningTime="2026-04-17 20:06:24.77760946 +0000 UTC m=+133.117763811" watchObservedRunningTime="2026-04-17 20:06:24.777981178 +0000 UTC m=+133.118135528" Apr 17 20:06:25.676103 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:25.676073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8555ccc55b-vmfxx" Apr 17 20:06:25.750408 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:06:25.750366 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-66d48c7756-dzkf6" Apr 17 20:06:32.754706 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:32.754664 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:32.755289 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:32.754717 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:32.759133 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:32.759108 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:32.791544 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:32.791517 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:06:32.836949 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:32.836919 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d69cfc6df-4lvmh"] Apr 17 20:06:47.051108 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:06:47.051052 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-nrnx5" podUID="ea635e92-8024-48e9-9b19-6fbeddfe380a" Apr 17 20:06:47.066407 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:06:47.066349 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qpnwl" podUID="37414adb-2a0d-4af9-93ad-64cc2ea178e7" Apr 17 20:06:47.835617 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:47.835584 2568 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-dns/dns-default-nrnx5" Apr 17 20:06:51.848696 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:51.848662 2568 generic.go:358] "Generic (PLEG): container finished" podID="1980268a-97f5-4f06-8170-7ecf507eddf7" containerID="c5506394ee94c27ed46a535788feb36de99bf9e818167022a8524006a98c9596" exitCode=0 Apr 17 20:06:51.849064 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:51.848710 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-674s7" event={"ID":"1980268a-97f5-4f06-8170-7ecf507eddf7","Type":"ContainerDied","Data":"c5506394ee94c27ed46a535788feb36de99bf9e818167022a8524006a98c9596"} Apr 17 20:06:51.849064 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:51.849033 2568 scope.go:117] "RemoveContainer" containerID="c5506394ee94c27ed46a535788feb36de99bf9e818167022a8524006a98c9596" Apr 17 20:06:52.041594 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.041547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl" Apr 17 20:06:52.041774 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.041621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " pod="openshift-dns/dns-default-nrnx5" Apr 17 20:06:52.044030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.043991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ea635e92-8024-48e9-9b19-6fbeddfe380a-metrics-tls\") pod \"dns-default-nrnx5\" (UID: \"ea635e92-8024-48e9-9b19-6fbeddfe380a\") " 
pod="openshift-dns/dns-default-nrnx5"
Apr 17 20:06:52.044161 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.044055 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/37414adb-2a0d-4af9-93ad-64cc2ea178e7-cert\") pod \"ingress-canary-qpnwl\" (UID: \"37414adb-2a0d-4af9-93ad-64cc2ea178e7\") " pod="openshift-ingress-canary/ingress-canary-qpnwl"
Apr 17 20:06:52.339000 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.338968 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4cbzc\""
Apr 17 20:06:52.346927 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.346904 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrnx5"
Apr 17 20:06:52.476591 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.476271 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrnx5"]
Apr 17 20:06:52.852893 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.852850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrnx5" event={"ID":"ea635e92-8024-48e9-9b19-6fbeddfe380a","Type":"ContainerStarted","Data":"e53ce20114d01a94fd57f5eedc894e5e293abdb807cb007ab00e7fb09bb2b0a3"}
Apr 17 20:06:52.854484 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:52.854455 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-674s7" event={"ID":"1980268a-97f5-4f06-8170-7ecf507eddf7","Type":"ContainerStarted","Data":"e2cd7eaece205727aef524761edf4ce4e326aec85be680c73a4e9dc706222efe"}
Apr 17 20:06:54.861941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:54.861905 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrnx5" event={"ID":"ea635e92-8024-48e9-9b19-6fbeddfe380a","Type":"ContainerStarted","Data":"ce54e584165e653d0f51e16091c6ec37914b3d9739b10caf29e150f55c8b3b61"}
Apr 17 20:06:54.861941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:54.861943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrnx5" event={"ID":"ea635e92-8024-48e9-9b19-6fbeddfe380a","Type":"ContainerStarted","Data":"d13d8579c64b5e4233b48972ca85724843cc35a1b7f5ea4260f3f919907746b1"}
Apr 17 20:06:54.862360 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:54.861977 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-nrnx5"
Apr 17 20:06:54.877673 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:54.877621 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nrnx5" podStartSLOduration=130.498659794 podStartE2EDuration="2m11.877605977s" podCreationTimestamp="2026-04-17 20:04:43 +0000 UTC" firstStartedPulling="2026-04-17 20:06:52.482287224 +0000 UTC m=+160.822441553" lastFinishedPulling="2026-04-17 20:06:53.861233407 +0000 UTC m=+162.201387736" observedRunningTime="2026-04-17 20:06:54.876303867 +0000 UTC m=+163.216458218" watchObservedRunningTime="2026-04-17 20:06:54.877605977 +0000 UTC m=+163.217760328"
Apr 17 20:06:57.857278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:57.857234 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d69cfc6df-4lvmh" podUID="2bcc2590-ce6d-46f5-9bb5-44573329abe4" containerName="console" containerID="cri-o://fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702" gracePeriod=15
Apr 17 20:06:58.103948 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.103922 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d69cfc6df-4lvmh_2bcc2590-ce6d-46f5-9bb5-44573329abe4/console/0.log"
Apr 17 20:06:58.104070 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.103998 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:58.193148 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193107 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-oauth-serving-cert\") pod \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") "
Apr 17 20:06:58.193334 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193159 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-config\") pod \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") "
Apr 17 20:06:58.193334 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193195 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cfkw\" (UniqueName: \"kubernetes.io/projected/2bcc2590-ce6d-46f5-9bb5-44573329abe4-kube-api-access-2cfkw\") pod \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") "
Apr 17 20:06:58.193334 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193223 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-serving-cert\") pod \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") "
Apr 17 20:06:58.193334 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193245 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-service-ca\") pod \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") "
Apr 17 20:06:58.193334 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193285 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-oauth-config\") pod \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\" (UID: \"2bcc2590-ce6d-46f5-9bb5-44573329abe4\") "
Apr 17 20:06:58.193692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193657 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-config" (OuterVolumeSpecName: "console-config") pod "2bcc2590-ce6d-46f5-9bb5-44573329abe4" (UID: "2bcc2590-ce6d-46f5-9bb5-44573329abe4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:06:58.193757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193665 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2bcc2590-ce6d-46f5-9bb5-44573329abe4" (UID: "2bcc2590-ce6d-46f5-9bb5-44573329abe4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:06:58.193757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.193735 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-service-ca" (OuterVolumeSpecName: "service-ca") pod "2bcc2590-ce6d-46f5-9bb5-44573329abe4" (UID: "2bcc2590-ce6d-46f5-9bb5-44573329abe4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:06:58.195713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.195679 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcc2590-ce6d-46f5-9bb5-44573329abe4-kube-api-access-2cfkw" (OuterVolumeSpecName: "kube-api-access-2cfkw") pod "2bcc2590-ce6d-46f5-9bb5-44573329abe4" (UID: "2bcc2590-ce6d-46f5-9bb5-44573329abe4"). InnerVolumeSpecName "kube-api-access-2cfkw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:06:58.195713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.195700 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2bcc2590-ce6d-46f5-9bb5-44573329abe4" (UID: "2bcc2590-ce6d-46f5-9bb5-44573329abe4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:06:58.195851 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.195711 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2bcc2590-ce6d-46f5-9bb5-44573329abe4" (UID: "2bcc2590-ce6d-46f5-9bb5-44573329abe4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:06:58.294780 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.294740 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-oauth-serving-cert\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:06:58.294780 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.294772 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-config\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:06:58.294780 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.294787 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2cfkw\" (UniqueName: \"kubernetes.io/projected/2bcc2590-ce6d-46f5-9bb5-44573329abe4-kube-api-access-2cfkw\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:06:58.295057 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.294802 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-serving-cert\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:06:58.295057 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.294814 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2bcc2590-ce6d-46f5-9bb5-44573329abe4-service-ca\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:06:58.295057 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.294826 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2bcc2590-ce6d-46f5-9bb5-44573329abe4-console-oauth-config\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:06:58.875181 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.875150 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d69cfc6df-4lvmh_2bcc2590-ce6d-46f5-9bb5-44573329abe4/console/0.log"
Apr 17 20:06:58.875609 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.875196 2568 generic.go:358] "Generic (PLEG): container finished" podID="2bcc2590-ce6d-46f5-9bb5-44573329abe4" containerID="fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702" exitCode=2
Apr 17 20:06:58.875609 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.875236 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d69cfc6df-4lvmh" event={"ID":"2bcc2590-ce6d-46f5-9bb5-44573329abe4","Type":"ContainerDied","Data":"fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702"}
Apr 17 20:06:58.875609 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.875275 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d69cfc6df-4lvmh"
Apr 17 20:06:58.875609 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.875293 2568 scope.go:117] "RemoveContainer" containerID="fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702"
Apr 17 20:06:58.875609 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.875279 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d69cfc6df-4lvmh" event={"ID":"2bcc2590-ce6d-46f5-9bb5-44573329abe4","Type":"ContainerDied","Data":"2b88ea7643d0a20ab8272c8914ef441e22264e94a13f13070c4599b7040a18ce"}
Apr 17 20:06:58.883329 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.883311 2568 scope.go:117] "RemoveContainer" containerID="fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702"
Apr 17 20:06:58.883631 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:06:58.883610 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702\": container with ID starting with fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702 not found: ID does not exist" containerID="fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702"
Apr 17 20:06:58.883705 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.883643 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702"} err="failed to get container status \"fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702\": rpc error: code = NotFound desc = could not find container \"fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702\": container with ID starting with fd81885cae4b2814fdb6fa33afa024edcbb9a970bd7b0c7f1307c9f95cdbe702 not found: ID does not exist"
Apr 17 20:06:58.891032 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.891000 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d69cfc6df-4lvmh"]
Apr 17 20:06:58.894220 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:58.894198 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d69cfc6df-4lvmh"]
Apr 17 20:06:59.263330 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:59.263289 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qpnwl"
Apr 17 20:06:59.265924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:59.265899 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tl4j2\""
Apr 17 20:06:59.273837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:59.273817 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qpnwl"
Apr 17 20:06:59.399679 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:59.399655 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qpnwl"]
Apr 17 20:06:59.402473 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:06:59.402448 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37414adb_2a0d_4af9_93ad_64cc2ea178e7.slice/crio-f83756bd8861e38c36ecc99f0a36afd3191cf58fc95fca8101518ee1d6a1af23 WatchSource:0}: Error finding container f83756bd8861e38c36ecc99f0a36afd3191cf58fc95fca8101518ee1d6a1af23: Status 404 returned error can't find the container with id f83756bd8861e38c36ecc99f0a36afd3191cf58fc95fca8101518ee1d6a1af23
Apr 17 20:06:59.879925 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:06:59.879890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qpnwl" event={"ID":"37414adb-2a0d-4af9-93ad-64cc2ea178e7","Type":"ContainerStarted","Data":"f83756bd8861e38c36ecc99f0a36afd3191cf58fc95fca8101518ee1d6a1af23"}
Apr 17 20:07:00.268285 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:00.268246 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bcc2590-ce6d-46f5-9bb5-44573329abe4" path="/var/lib/kubelet/pods/2bcc2590-ce6d-46f5-9bb5-44573329abe4/volumes"
Apr 17 20:07:01.889151 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:01.889113 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qpnwl" event={"ID":"37414adb-2a0d-4af9-93ad-64cc2ea178e7","Type":"ContainerStarted","Data":"c71541b405c34f3ab0e3dfadd72f2a2206d8b06eb8bb6fd66051d4eed09e4c93"}
Apr 17 20:07:01.909015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:01.908965 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qpnwl" podStartSLOduration=137.130938361 podStartE2EDuration="2m18.908950182s" podCreationTimestamp="2026-04-17 20:04:43 +0000 UTC" firstStartedPulling="2026-04-17 20:06:59.404288112 +0000 UTC m=+167.744442441" lastFinishedPulling="2026-04-17 20:07:01.182299929 +0000 UTC m=+169.522454262" observedRunningTime="2026-04-17 20:07:01.907909582 +0000 UTC m=+170.248063947" watchObservedRunningTime="2026-04-17 20:07:01.908950182 +0000 UTC m=+170.249104609"
Apr 17 20:07:04.868135 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:04.868101 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nrnx5"
Apr 17 20:07:06.909307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:06.909276 2568 generic.go:358] "Generic (PLEG): container finished" podID="fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba" containerID="9a9cba4527fcb75cb92aebb3106e8cfd7f89a43a17bb1150007ff80a4afa99a0" exitCode=0
Apr 17 20:07:06.909674 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:06.909331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" event={"ID":"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba","Type":"ContainerDied","Data":"9a9cba4527fcb75cb92aebb3106e8cfd7f89a43a17bb1150007ff80a4afa99a0"}
Apr 17 20:07:06.909759 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:06.909691 2568 scope.go:117] "RemoveContainer" containerID="9a9cba4527fcb75cb92aebb3106e8cfd7f89a43a17bb1150007ff80a4afa99a0"
Apr 17 20:07:07.913678 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:07.913631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-jbbnx" event={"ID":"fe56c17d-bf66-46c0-a9c6-5baf8eb3ccba","Type":"ContainerStarted","Data":"7de5760b2e008de303f106ed5b9acf34cecf059bbdbc53f5b04e20526681172a"}
Apr 17 20:07:34.380630 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.380594 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 20:07:34.381106 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.380940 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bcc2590-ce6d-46f5-9bb5-44573329abe4" containerName="console"
Apr 17 20:07:34.381106 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.380954 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcc2590-ce6d-46f5-9bb5-44573329abe4" containerName="console"
Apr 17 20:07:34.381106 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.381016 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bcc2590-ce6d-46f5-9bb5-44573329abe4" containerName="console"
Apr 17 20:07:34.383814 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.383795 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.386629 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386601 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 20:07:34.386787 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386640 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 20:07:34.386787 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386663 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 20:07:34.386787 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386671 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 20:07:34.386787 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386711 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fdt85\""
Apr 17 20:07:34.386787 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386740 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 20:07:34.387048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.386836 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 20:07:34.387103 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.387068 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 20:07:34.387158 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.387126 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 20:07:34.392498 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.392481 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 20:07:34.398068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.398039 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 20:07:34.514062 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-config-volume\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514062 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514061 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2bjf\" (UniqueName: \"kubernetes.io/projected/467b097d-25d9-4d5d-a793-923abf4bc77e-kube-api-access-w2bjf\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b097d-25d9-4d5d-a793-923abf4bc77e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514141 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-web-config\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514157 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b097d-25d9-4d5d-a793-923abf4bc77e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514293 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514582 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514582 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/467b097d-25d9-4d5d-a793-923abf4bc77e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514582 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b097d-25d9-4d5d-a793-923abf4bc77e-config-out\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.514582 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.514385 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b097d-25d9-4d5d-a793-923abf4bc77e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.615664 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.615624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.615664 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.615663 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.615885 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.615688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b097d-25d9-4d5d-a793-923abf4bc77e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.615885 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.615811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-web-config\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.615885 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.615857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616032 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.615974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b097d-25d9-4d5d-a793-923abf4bc77e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616085 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616054 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616146 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616146 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/467b097d-25d9-4d5d-a793-923abf4bc77e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616177 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b097d-25d9-4d5d-a793-923abf4bc77e-config-out\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616316 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b097d-25d9-4d5d-a793-923abf4bc77e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616316 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-config-volume\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616316 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616308 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2bjf\" (UniqueName: \"kubernetes.io/projected/467b097d-25d9-4d5d-a793-923abf4bc77e-kube-api-access-w2bjf\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.616562 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.616536 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b097d-25d9-4d5d-a793-923abf4bc77e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.618039 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.617703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b097d-25d9-4d5d-a793-923abf4bc77e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.618804 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.618780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b097d-25d9-4d5d-a793-923abf4bc77e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.618914 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.618864 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.618981 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.618965 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.619074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.619053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/467b097d-25d9-4d5d-a793-923abf4bc77e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.619140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.619053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-web-config\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.619140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.619091 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.619340 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.619322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.619488 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.619473 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b097d-25d9-4d5d-a793-923abf4bc77e-config-out\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.620224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.620209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-config-volume\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.620825 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.620804 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b097d-25d9-4d5d-a793-923abf4bc77e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.623979 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.623955 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2bjf\" (UniqueName: \"kubernetes.io/projected/467b097d-25d9-4d5d-a793-923abf4bc77e-kube-api-access-w2bjf\") pod \"alertmanager-main-0\" (UID: \"467b097d-25d9-4d5d-a793-923abf4bc77e\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.694498 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.694464 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 20:07:34.823022 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.822976 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 20:07:34.825896 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:07:34.825867 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467b097d_25d9_4d5d_a793_923abf4bc77e.slice/crio-efd247d536acf50d01f0a8a20eb52d9a2d985c6064db4653741da1f603cda08c WatchSource:0}: Error finding container efd247d536acf50d01f0a8a20eb52d9a2d985c6064db4653741da1f603cda08c: Status 404 returned error can't find the container with id efd247d536acf50d01f0a8a20eb52d9a2d985c6064db4653741da1f603cda08c
Apr 17 20:07:34.995895 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.995811 2568 generic.go:358] "Generic (PLEG): container finished" podID="467b097d-25d9-4d5d-a793-923abf4bc77e" containerID="057ceb87fdbe3581d7b2f22d63857395c2480172d754c422d32e60368eb4072b" exitCode=0
Apr 17 20:07:34.996030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.995903 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerDied","Data":"057ceb87fdbe3581d7b2f22d63857395c2480172d754c422d32e60368eb4072b"}
Apr 17 20:07:34.996030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:34.995937 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"efd247d536acf50d01f0a8a20eb52d9a2d985c6064db4653741da1f603cda08c"}
Apr 17 20:07:37.006054 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.006020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"3675a3e6a2500ec8c4c52c9a4d9d6e7ab268fb00e175b18ee71e1e0897d54823"} Apr 17 20:07:37.006054 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.006056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"240480f4878d66cc6a82871e7c2ef289862447128e7b2057043c6ac86a3eb46c"} Apr 17 20:07:37.006484 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.006067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"90a266fc9d1408f67a72af7a17cfac0c5ff8f0ffdd719323fa9614775c488410"} Apr 17 20:07:37.006484 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.006078 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"4097c5d50d1bcfd38bc6e1e2f643a6a02c3c10b896af50ae2c10403b82433ef7"} Apr 17 20:07:37.006484 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.006085 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"241bdbc32f759cfd5f618fff8bf5c4b70362c09dbec2d89c73b131539bdf7f2f"} Apr 17 20:07:37.006484 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.006093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"467b097d-25d9-4d5d-a793-923abf4bc77e","Type":"ContainerStarted","Data":"281443e78b23b34b124c5e17f66e5217e5d353d51f5fa424acc7c7d54d3ed411"} Apr 17 20:07:37.031015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:37.030948 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.6039858 podStartE2EDuration="3.030927956s" podCreationTimestamp="2026-04-17 20:07:34 +0000 UTC" firstStartedPulling="2026-04-17 20:07:34.997044326 +0000 UTC m=+203.337198660" lastFinishedPulling="2026-04-17 20:07:36.423986483 +0000 UTC m=+204.764140816" observedRunningTime="2026-04-17 20:07:37.028988922 +0000 UTC m=+205.369143273" watchObservedRunningTime="2026-04-17 20:07:37.030927956 +0000 UTC m=+205.371082309" Apr 17 20:07:46.777177 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.777137 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d55486bdd-7z5gj"] Apr 17 20:07:46.781276 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.781229 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.792984 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.792951 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d55486bdd-7z5gj"] Apr 17 20:07:46.824897 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.824862 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-trusted-ca-bundle\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.825069 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.824923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-config\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.825069 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:07:46.824947 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tn66\" (UniqueName: \"kubernetes.io/projected/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-kube-api-access-2tn66\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.825069 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.824966 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-serving-cert\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.825069 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.824994 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-oauth-config\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.825214 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.825094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-service-ca\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.825214 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.825127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-oauth-serving-cert\") pod 
\"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.925980 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.925939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-service-ca\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.925980 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.925986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-oauth-serving-cert\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926199 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-trusted-ca-bundle\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926199 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-config\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926277 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tn66\" (UniqueName: 
\"kubernetes.io/projected/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-kube-api-access-2tn66\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926277 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-serving-cert\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926448 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-oauth-config\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926811 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926774 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-service-ca\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.926811 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-oauth-serving-cert\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.927001 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.926836 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-config\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.927080 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.927055 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-trusted-ca-bundle\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.928727 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.928703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-serving-cert\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.928727 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.928722 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-oauth-config\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:46.933583 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:46.933564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tn66\" (UniqueName: \"kubernetes.io/projected/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-kube-api-access-2tn66\") pod \"console-7d55486bdd-7z5gj\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:47.091261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:47.091154 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:47.222846 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:47.222818 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d55486bdd-7z5gj"] Apr 17 20:07:47.225251 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:07:47.225205 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2dcf27_72e1_455f_8c5a_3c3e6cb3767e.slice/crio-4ab94cc22a924bd275e7ddd98046256fe3c7f9ea0904ec3fbc9c455e77094e84 WatchSource:0}: Error finding container 4ab94cc22a924bd275e7ddd98046256fe3c7f9ea0904ec3fbc9c455e77094e84: Status 404 returned error can't find the container with id 4ab94cc22a924bd275e7ddd98046256fe3c7f9ea0904ec3fbc9c455e77094e84 Apr 17 20:07:48.041258 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:48.041222 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d55486bdd-7z5gj" event={"ID":"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e","Type":"ContainerStarted","Data":"c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677"} Apr 17 20:07:48.041258 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:48.041258 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d55486bdd-7z5gj" event={"ID":"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e","Type":"ContainerStarted","Data":"4ab94cc22a924bd275e7ddd98046256fe3c7f9ea0904ec3fbc9c455e77094e84"} Apr 17 20:07:48.058791 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:48.058741 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d55486bdd-7z5gj" podStartSLOduration=2.058723975 podStartE2EDuration="2.058723975s" podCreationTimestamp="2026-04-17 20:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 
20:07:48.057122256 +0000 UTC m=+216.397276608" watchObservedRunningTime="2026-04-17 20:07:48.058723975 +0000 UTC m=+216.398878326" Apr 17 20:07:57.092201 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:57.092157 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:57.092201 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:57.092201 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:57.097073 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:57.097048 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:58.073099 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:58.073064 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:07:58.118584 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:07:58.118552 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6746c58bfc-m7bhw"] Apr 17 20:08:23.138838 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.138793 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6746c58bfc-m7bhw" podUID="35cb8783-2a10-4a90-b516-500f71ad6775" containerName="console" containerID="cri-o://5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f" gracePeriod=15 Apr 17 20:08:23.374880 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.374857 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6746c58bfc-m7bhw_35cb8783-2a10-4a90-b516-500f71ad6775/console/0.log" Apr 17 20:08:23.375009 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.374919 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:08:23.457067 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457034 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-console-config\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457243 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457083 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-serving-cert\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457243 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457112 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-oauth-serving-cert\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457318 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457291 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-oauth-config\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457354 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457336 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-trusted-ca-bundle\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457433 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457364 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlcxb\" (UniqueName: \"kubernetes.io/projected/35cb8783-2a10-4a90-b516-500f71ad6775-kube-api-access-jlcxb\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457495 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457463 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-service-ca\") pod \"35cb8783-2a10-4a90-b516-500f71ad6775\" (UID: \"35cb8783-2a10-4a90-b516-500f71ad6775\") " Apr 17 20:08:23.457553 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457484 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:08:23.457553 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457536 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-console-config" (OuterVolumeSpecName: "console-config") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:08:23.457771 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457754 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-console-config\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:08:23.457836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457778 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-oauth-serving-cert\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:08:23.457836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457777 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:08:23.457904 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.457833 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-service-ca" (OuterVolumeSpecName: "service-ca") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:08:23.459488 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.459460 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35cb8783-2a10-4a90-b516-500f71ad6775-kube-api-access-jlcxb" (OuterVolumeSpecName: "kube-api-access-jlcxb") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). 
InnerVolumeSpecName "kube-api-access-jlcxb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:08:23.459920 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.459894 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:08:23.459970 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.459904 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "35cb8783-2a10-4a90-b516-500f71ad6775" (UID: "35cb8783-2a10-4a90-b516-500f71ad6775"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:08:23.558777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.558741 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-service-ca\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:08:23.558777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.558773 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-serving-cert\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:08:23.558777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.558783 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35cb8783-2a10-4a90-b516-500f71ad6775-console-oauth-config\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 
20:08:23.559012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.558792 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35cb8783-2a10-4a90-b516-500f71ad6775-trusted-ca-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:08:23.559012 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:23.558802 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlcxb\" (UniqueName: \"kubernetes.io/projected/35cb8783-2a10-4a90-b516-500f71ad6775-kube-api-access-jlcxb\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:08:24.154344 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.154316 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6746c58bfc-m7bhw_35cb8783-2a10-4a90-b516-500f71ad6775/console/0.log" Apr 17 20:08:24.154744 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.154358 2568 generic.go:358] "Generic (PLEG): container finished" podID="35cb8783-2a10-4a90-b516-500f71ad6775" containerID="5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f" exitCode=2 Apr 17 20:08:24.154744 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.154458 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6746c58bfc-m7bhw" Apr 17 20:08:24.154744 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.154457 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6746c58bfc-m7bhw" event={"ID":"35cb8783-2a10-4a90-b516-500f71ad6775","Type":"ContainerDied","Data":"5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f"} Apr 17 20:08:24.154744 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.154501 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6746c58bfc-m7bhw" event={"ID":"35cb8783-2a10-4a90-b516-500f71ad6775","Type":"ContainerDied","Data":"c8f924092a36c4c018671e24475f3aab54c897bd74d07524ed46a906b4cd8f5e"} Apr 17 20:08:24.154744 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.154524 2568 scope.go:117] "RemoveContainer" containerID="5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f" Apr 17 20:08:24.163911 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.163895 2568 scope.go:117] "RemoveContainer" containerID="5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f" Apr 17 20:08:24.164199 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:08:24.164179 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f\": container with ID starting with 5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f not found: ID does not exist" containerID="5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f" Apr 17 20:08:24.164236 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.164209 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f"} err="failed to get container status \"5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f\": rpc error: code = 
NotFound desc = could not find container \"5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f\": container with ID starting with 5bec3157a411caf94942bef56150d306d167463d503734e6bbc34565387f953f not found: ID does not exist" Apr 17 20:08:24.176260 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.176175 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6746c58bfc-m7bhw"] Apr 17 20:08:24.179287 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.179256 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6746c58bfc-m7bhw"] Apr 17 20:08:24.267675 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:24.267639 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35cb8783-2a10-4a90-b516-500f71ad6775" path="/var/lib/kubelet/pods/35cb8783-2a10-4a90-b516-500f71ad6775/volumes" Apr 17 20:08:48.078132 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.078041 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv"] Apr 17 20:08:48.078622 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.078603 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35cb8783-2a10-4a90-b516-500f71ad6775" containerName="console" Apr 17 20:08:48.078696 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.078625 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cb8783-2a10-4a90-b516-500f71ad6775" containerName="console" Apr 17 20:08:48.078754 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.078739 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="35cb8783-2a10-4a90-b516-500f71ad6775" containerName="console" Apr 17 20:08:48.081833 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.081808 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.084486 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.084462 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:08:48.084610 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.084540 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:08:48.085323 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.085302 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n7smc\"" Apr 17 20:08:48.090065 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.090045 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv"] Apr 17 20:08:48.169460 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.169387 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.169636 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.169473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.169636 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.169489 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpww8\" (UniqueName: \"kubernetes.io/projected/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-kube-api-access-vpww8\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.270822 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.270789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.271017 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.270829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.271017 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.270848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpww8\" (UniqueName: \"kubernetes.io/projected/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-kube-api-access-vpww8\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.271142 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:08:48.271126 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.271224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.271201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.278667 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.278631 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpww8\" (UniqueName: \"kubernetes.io/projected/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-kube-api-access-vpww8\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.393161 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.393054 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:08:48.536850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:48.536823 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv"] Apr 17 20:08:48.538821 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:08:48.538793 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85e0fd6b_01ee_4fd3_a4a6_e2f8fd218cb6.slice/crio-d67fe351f548067f427af8149c11d0cc1958301fb5cedc2f008f31674aae12b7 WatchSource:0}: Error finding container d67fe351f548067f427af8149c11d0cc1958301fb5cedc2f008f31674aae12b7: Status 404 returned error can't find the container with id d67fe351f548067f427af8149c11d0cc1958301fb5cedc2f008f31674aae12b7 Apr 17 20:08:49.235364 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:49.235326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" event={"ID":"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6","Type":"ContainerStarted","Data":"d67fe351f548067f427af8149c11d0cc1958301fb5cedc2f008f31674aae12b7"} Apr 17 20:08:55.256342 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:55.256246 2568 generic.go:358] "Generic (PLEG): container finished" podID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerID="6f6a689bc2fc5ddc66071f89c39098d282ea42e9bfcf14e5ccb853a90b3e72b4" exitCode=0 Apr 17 20:08:55.256816 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:08:55.256334 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" event={"ID":"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6","Type":"ContainerDied","Data":"6f6a689bc2fc5ddc66071f89c39098d282ea42e9bfcf14e5ccb853a90b3e72b4"} Apr 17 20:09:05.641488 ip-10-0-130-159 kubenswrapper[2568]: 
E0417 20:09:05.641439 2568 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/cert-manager/cert-manager-operator-bundle@sha256=e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908/signature-4: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908" Apr 17 20:09:05.642001 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:09:05.641637 2568 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpww8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000450000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv_openshift-marketplace(85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6): ErrImagePull: unable to pull image or OCI artifact: pull image err: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/cert-manager/cert-manager-operator-bundle@sha256=e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908/signature-4: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 20:09:05.642810 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:09:05.642780 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: reading signatures: reading signature 
from https://registry.redhat.io/containers/sigstore/cert-manager/cert-manager-operator-bundle@sha256=e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908/signature-4: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" Apr 17 20:09:06.295779 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:09:06.295741 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: reading signatures: reading signature from https://registry.redhat.io/containers/sigstore/cert-manager/cert-manager-operator-bundle@sha256=e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908/signature-4: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" Apr 17 20:09:12.142921 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:12.142884 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:09:12.145442 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:12.145416 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:09:12.147356 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:12.147326 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log" Apr 17 20:09:12.150908 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:12.150704 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log" Apr 17 20:09:12.153807 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:12.153786 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:09:19.266330 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:19.264222 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:09:21.347240 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:21.347204 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" event={"ID":"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6","Type":"ContainerStarted","Data":"b74baeb8ba1e4debc4b2a02340ce4878a4e1981e0b7a471a400ac14242781ecd"} Apr 17 20:09:22.351473 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:22.351441 2568 generic.go:358] "Generic (PLEG): container finished" podID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerID="b74baeb8ba1e4debc4b2a02340ce4878a4e1981e0b7a471a400ac14242781ecd" exitCode=0 Apr 17 20:09:22.351881 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:22.351512 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" event={"ID":"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6","Type":"ContainerDied","Data":"b74baeb8ba1e4debc4b2a02340ce4878a4e1981e0b7a471a400ac14242781ecd"} Apr 17 20:09:32.387316 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:32.387281 2568 generic.go:358] "Generic (PLEG): container finished" podID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerID="f71016c7ad18283c8944e856eb13c69f2b7800c025647de79afd3c8d94d406f8" 
exitCode=0 Apr 17 20:09:32.387711 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:32.387350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" event={"ID":"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6","Type":"ContainerDied","Data":"f71016c7ad18283c8944e856eb13c69f2b7800c025647de79afd3c8d94d406f8"} Apr 17 20:09:33.518603 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.518580 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:09:33.578685 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.578634 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-util\") pod \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " Apr 17 20:09:33.578878 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.578703 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-bundle\") pod \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " Apr 17 20:09:33.578878 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.578811 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpww8\" (UniqueName: \"kubernetes.io/projected/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-kube-api-access-vpww8\") pod \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\" (UID: \"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6\") " Apr 17 20:09:33.579342 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.579317 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-bundle" (OuterVolumeSpecName: 
"bundle") pod "85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" (UID: "85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:09:33.581035 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.581010 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-kube-api-access-vpww8" (OuterVolumeSpecName: "kube-api-access-vpww8") pod "85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" (UID: "85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6"). InnerVolumeSpecName "kube-api-access-vpww8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:09:33.583247 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.583226 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-util" (OuterVolumeSpecName: "util") pod "85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" (UID: "85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:09:33.679924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.679835 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpww8\" (UniqueName: \"kubernetes.io/projected/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-kube-api-access-vpww8\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:09:33.679924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.679869 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:09:33.679924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:33.679878 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:09:34.395552 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:34.395516 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" event={"ID":"85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6","Type":"ContainerDied","Data":"d67fe351f548067f427af8149c11d0cc1958301fb5cedc2f008f31674aae12b7"} Apr 17 20:09:34.395552 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:34.395556 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67fe351f548067f427af8149c11d0cc1958301fb5cedc2f008f31674aae12b7" Apr 17 20:09:34.395749 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:34.395527 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5829xv" Apr 17 20:09:40.687935 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.687895 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq"] Apr 17 20:09:40.688307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688255 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="pull" Apr 17 20:09:40.688307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688266 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="pull" Apr 17 20:09:40.688307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688279 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="util" Apr 17 20:09:40.688307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688286 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="util" Apr 17 20:09:40.688307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688295 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="extract" Apr 17 20:09:40.688307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688301 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="extract" Apr 17 20:09:40.688506 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.688357 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="85e0fd6b-01ee-4fd3-a4a6-e2f8fd218cb6" containerName="extract" Apr 17 20:09:40.691649 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.691628 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:40.694243 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.694214 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-qk9nd\"" Apr 17 20:09:40.694243 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.694215 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 20:09:40.694455 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.694215 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:09:40.707539 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.707511 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq"] Apr 17 20:09:40.848610 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.848566 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltd8\" (UniqueName: \"kubernetes.io/projected/a8570e09-9fa0-45e0-8e89-07208bbb1466-kube-api-access-qltd8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8wrdq\" (UID: \"a8570e09-9fa0-45e0-8e89-07208bbb1466\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:40.848799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.848670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8570e09-9fa0-45e0-8e89-07208bbb1466-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8wrdq\" (UID: \"a8570e09-9fa0-45e0-8e89-07208bbb1466\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" 
Apr 17 20:09:40.949380 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.949280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8570e09-9fa0-45e0-8e89-07208bbb1466-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8wrdq\" (UID: \"a8570e09-9fa0-45e0-8e89-07208bbb1466\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:40.949557 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.949456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qltd8\" (UniqueName: \"kubernetes.io/projected/a8570e09-9fa0-45e0-8e89-07208bbb1466-kube-api-access-qltd8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8wrdq\" (UID: \"a8570e09-9fa0-45e0-8e89-07208bbb1466\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:40.949708 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.949690 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a8570e09-9fa0-45e0-8e89-07208bbb1466-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8wrdq\" (UID: \"a8570e09-9fa0-45e0-8e89-07208bbb1466\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:40.957270 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:40.957245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltd8\" (UniqueName: \"kubernetes.io/projected/a8570e09-9fa0-45e0-8e89-07208bbb1466-kube-api-access-qltd8\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8wrdq\" (UID: \"a8570e09-9fa0-45e0-8e89-07208bbb1466\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:41.000870 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:41.000828 2568 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" Apr 17 20:09:41.131605 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:41.131529 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq"] Apr 17 20:09:41.134013 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:09:41.133978 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8570e09_9fa0_45e0_8e89_07208bbb1466.slice/crio-b8471dfb5d456ad06f9096d9ce13104904392574d5bd7d61596a12739295a6e6 WatchSource:0}: Error finding container b8471dfb5d456ad06f9096d9ce13104904392574d5bd7d61596a12739295a6e6: Status 404 returned error can't find the container with id b8471dfb5d456ad06f9096d9ce13104904392574d5bd7d61596a12739295a6e6 Apr 17 20:09:41.420042 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:41.419948 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" event={"ID":"a8570e09-9fa0-45e0-8e89-07208bbb1466","Type":"ContainerStarted","Data":"b8471dfb5d456ad06f9096d9ce13104904392574d5bd7d61596a12739295a6e6"} Apr 17 20:09:44.432461 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:44.432421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" event={"ID":"a8570e09-9fa0-45e0-8e89-07208bbb1466","Type":"ContainerStarted","Data":"656d757486c137e66477f3a9623312a5daa2919474fc9ff9ff303d5edd4e901e"} Apr 17 20:09:44.453563 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:44.453503 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8wrdq" podStartSLOduration=2.032698555 podStartE2EDuration="4.45348301s" podCreationTimestamp="2026-04-17 
20:09:40 +0000 UTC" firstStartedPulling="2026-04-17 20:09:41.136475092 +0000 UTC m=+329.476629421" lastFinishedPulling="2026-04-17 20:09:43.557259544 +0000 UTC m=+331.897413876" observedRunningTime="2026-04-17 20:09:44.452910697 +0000 UTC m=+332.793065049" watchObservedRunningTime="2026-04-17 20:09:44.45348301 +0000 UTC m=+332.793637361" Apr 17 20:09:45.546375 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.546337 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr"] Apr 17 20:09:45.549979 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.549963 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.552310 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.552284 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:09:45.553077 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.553057 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:09:45.553175 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.553056 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n7smc\"" Apr 17 20:09:45.557353 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.557323 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr"] Apr 17 20:09:45.697681 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.697632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svvd\" (UniqueName: 
\"kubernetes.io/projected/4923ef86-36cc-456c-8655-a65ebffc430e-kube-api-access-9svvd\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.697681 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.697686 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.697908 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.697711 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.799222 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.799120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9svvd\" (UniqueName: \"kubernetes.io/projected/4923ef86-36cc-456c-8655-a65ebffc430e-kube-api-access-9svvd\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.799222 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.799166 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.799222 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.799187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.799671 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.799653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.799710 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.799686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.809785 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.809746 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svvd\" (UniqueName: \"kubernetes.io/projected/4923ef86-36cc-456c-8655-a65ebffc430e-kube-api-access-9svvd\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.860797 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.860757 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:45.982822 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:45.982789 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr"] Apr 17 20:09:45.984856 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:09:45.984825 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4923ef86_36cc_456c_8655_a65ebffc430e.slice/crio-df1af90840da595ee47615eaef1869f5b03d996f1c218c4b345f049696adbf3d WatchSource:0}: Error finding container df1af90840da595ee47615eaef1869f5b03d996f1c218c4b345f049696adbf3d: Status 404 returned error can't find the container with id df1af90840da595ee47615eaef1869f5b03d996f1c218c4b345f049696adbf3d Apr 17 20:09:46.440225 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:46.440189 2568 generic.go:358] "Generic (PLEG): container finished" podID="4923ef86-36cc-456c-8655-a65ebffc430e" containerID="20586ddcc709fa5b4f5f7834806e2d6bd2fc907af044b9c683e3a1777492c7a0" exitCode=0 Apr 17 20:09:46.440415 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:46.440231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" event={"ID":"4923ef86-36cc-456c-8655-a65ebffc430e","Type":"ContainerDied","Data":"20586ddcc709fa5b4f5f7834806e2d6bd2fc907af044b9c683e3a1777492c7a0"} Apr 17 20:09:46.440415 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:46.440255 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" event={"ID":"4923ef86-36cc-456c-8655-a65ebffc430e","Type":"ContainerStarted","Data":"df1af90840da595ee47615eaef1869f5b03d996f1c218c4b345f049696adbf3d"} Apr 17 20:09:48.755582 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.755549 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-ftszv"] Apr 17 20:09:48.758859 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.758843 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:48.761503 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.761471 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 20:09:48.761618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.761507 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 20:09:48.762308 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.762288 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-hv6ff\"" Apr 17 20:09:48.766157 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.766128 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-ftszv"] Apr 17 20:09:48.930113 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.930069 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phb2j\" (UniqueName: \"kubernetes.io/projected/97ca2282-b2f6-4f3c-a079-74f9dac3c6ab-kube-api-access-phb2j\") pod \"cert-manager-cainjector-8966b78d4-ftszv\" (UID: \"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:48.930296 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:48.930129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97ca2282-b2f6-4f3c-a079-74f9dac3c6ab-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-ftszv\" (UID: \"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:49.031259 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.031154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97ca2282-b2f6-4f3c-a079-74f9dac3c6ab-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-ftszv\" (UID: \"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:49.031479 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.031276 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phb2j\" (UniqueName: \"kubernetes.io/projected/97ca2282-b2f6-4f3c-a079-74f9dac3c6ab-kube-api-access-phb2j\") pod \"cert-manager-cainjector-8966b78d4-ftszv\" (UID: \"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:49.040264 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.040235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97ca2282-b2f6-4f3c-a079-74f9dac3c6ab-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-ftszv\" (UID: \"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:49.040387 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.040353 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phb2j\" 
(UniqueName: \"kubernetes.io/projected/97ca2282-b2f6-4f3c-a079-74f9dac3c6ab-kube-api-access-phb2j\") pod \"cert-manager-cainjector-8966b78d4-ftszv\" (UID: \"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:49.076258 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.076215 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" Apr 17 20:09:49.203004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.202980 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-ftszv"] Apr 17 20:09:49.205545 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:09:49.205516 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97ca2282_b2f6_4f3c_a079_74f9dac3c6ab.slice/crio-64d4da9bd73fc427f10ef9dd3afdc3b23f499a79778e5e0d60d4b1579e9be17a WatchSource:0}: Error finding container 64d4da9bd73fc427f10ef9dd3afdc3b23f499a79778e5e0d60d4b1579e9be17a: Status 404 returned error can't find the container with id 64d4da9bd73fc427f10ef9dd3afdc3b23f499a79778e5e0d60d4b1579e9be17a Apr 17 20:09:49.452999 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.452962 2568 generic.go:358] "Generic (PLEG): container finished" podID="4923ef86-36cc-456c-8655-a65ebffc430e" containerID="7f78a2a79e39d73ae7f637975d124ba55d6dad0369e67f8bede14c3d24dac8e6" exitCode=0 Apr 17 20:09:49.453205 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.453052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" event={"ID":"4923ef86-36cc-456c-8655-a65ebffc430e","Type":"ContainerDied","Data":"7f78a2a79e39d73ae7f637975d124ba55d6dad0369e67f8bede14c3d24dac8e6"} Apr 17 20:09:49.454219 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:49.454192 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" event={"ID":"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab","Type":"ContainerStarted","Data":"64d4da9bd73fc427f10ef9dd3afdc3b23f499a79778e5e0d60d4b1579e9be17a"} Apr 17 20:09:50.459771 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:50.459731 2568 generic.go:358] "Generic (PLEG): container finished" podID="4923ef86-36cc-456c-8655-a65ebffc430e" containerID="9fc3d84e272d63169d813535d22be24535b825c5251edd91dede92881ca57ee2" exitCode=0 Apr 17 20:09:50.460213 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:50.459792 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" event={"ID":"4923ef86-36cc-456c-8655-a65ebffc430e","Type":"ContainerDied","Data":"9fc3d84e272d63169d813535d22be24535b825c5251edd91dede92881ca57ee2"} Apr 17 20:09:52.052933 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.052907 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:52.160971 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.160942 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-util\") pod \"4923ef86-36cc-456c-8655-a65ebffc430e\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " Apr 17 20:09:52.161093 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.160998 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-bundle\") pod \"4923ef86-36cc-456c-8655-a65ebffc430e\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " Apr 17 20:09:52.161093 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.161086 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9svvd\" (UniqueName: \"kubernetes.io/projected/4923ef86-36cc-456c-8655-a65ebffc430e-kube-api-access-9svvd\") pod \"4923ef86-36cc-456c-8655-a65ebffc430e\" (UID: \"4923ef86-36cc-456c-8655-a65ebffc430e\") " Apr 17 20:09:52.161375 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.161354 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-bundle" (OuterVolumeSpecName: "bundle") pod "4923ef86-36cc-456c-8655-a65ebffc430e" (UID: "4923ef86-36cc-456c-8655-a65ebffc430e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:09:52.163185 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.163160 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4923ef86-36cc-456c-8655-a65ebffc430e-kube-api-access-9svvd" (OuterVolumeSpecName: "kube-api-access-9svvd") pod "4923ef86-36cc-456c-8655-a65ebffc430e" (UID: "4923ef86-36cc-456c-8655-a65ebffc430e"). InnerVolumeSpecName "kube-api-access-9svvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:09:52.165859 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.165837 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-util" (OuterVolumeSpecName: "util") pod "4923ef86-36cc-456c-8655-a65ebffc430e" (UID: "4923ef86-36cc-456c-8655-a65ebffc430e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:09:52.261778 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.261753 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:09:52.261778 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.261778 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9svvd\" (UniqueName: \"kubernetes.io/projected/4923ef86-36cc-456c-8655-a65ebffc430e-kube-api-access-9svvd\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:09:52.261778 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.261792 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4923ef86-36cc-456c-8655-a65ebffc430e-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:09:52.469423 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.469374 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" event={"ID":"4923ef86-36cc-456c-8655-a65ebffc430e","Type":"ContainerDied","Data":"df1af90840da595ee47615eaef1869f5b03d996f1c218c4b345f049696adbf3d"} Apr 17 20:09:52.469595 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.469430 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df1af90840da595ee47615eaef1869f5b03d996f1c218c4b345f049696adbf3d" Apr 17 20:09:52.469595 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.469446 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgtqdr" Apr 17 20:09:52.470979 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.470949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" event={"ID":"97ca2282-b2f6-4f3c-a079-74f9dac3c6ab","Type":"ContainerStarted","Data":"a929e91f0aa3b477cc782427823c2e1e6fac6389ad0216d3f54447fb74251cf5"} Apr 17 20:09:52.486458 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:09:52.486388 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-ftszv" podStartSLOduration=1.591619017 podStartE2EDuration="4.486372213s" podCreationTimestamp="2026-04-17 20:09:48 +0000 UTC" firstStartedPulling="2026-04-17 20:09:49.207368872 +0000 UTC m=+337.547523201" lastFinishedPulling="2026-04-17 20:09:52.102122069 +0000 UTC m=+340.442276397" observedRunningTime="2026-04-17 20:09:52.484808484 +0000 UTC m=+340.824962836" watchObservedRunningTime="2026-04-17 20:09:52.486372213 +0000 UTC m=+340.826526563" Apr 17 20:10:05.707949 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.707864 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-jfsrj"] Apr 17 20:10:05.708439 ip-10-0-130-159 
kubenswrapper[2568]: I0417 20:10:05.708255 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="util" Apr 17 20:10:05.708439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.708266 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="util" Apr 17 20:10:05.708439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.708278 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="extract" Apr 17 20:10:05.708439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.708284 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="extract" Apr 17 20:10:05.708439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.708301 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="pull" Apr 17 20:10:05.708439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.708306 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="pull" Apr 17 20:10:05.708439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.708356 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4923ef86-36cc-456c-8655-a65ebffc430e" containerName="extract" Apr 17 20:10:05.731877 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.731843 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-jfsrj"] Apr 17 20:10:05.732052 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.731911 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.735048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.735014 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-9955j\"" Apr 17 20:10:05.773513 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.773475 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmxmx\" (UniqueName: \"kubernetes.io/projected/d27683ee-d964-41e5-a38c-b4ff4ee04850-kube-api-access-mmxmx\") pod \"cert-manager-759f64656b-jfsrj\" (UID: \"d27683ee-d964-41e5-a38c-b4ff4ee04850\") " pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.773682 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.773543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d27683ee-d964-41e5-a38c-b4ff4ee04850-bound-sa-token\") pod \"cert-manager-759f64656b-jfsrj\" (UID: \"d27683ee-d964-41e5-a38c-b4ff4ee04850\") " pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.874641 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.874591 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmxmx\" (UniqueName: \"kubernetes.io/projected/d27683ee-d964-41e5-a38c-b4ff4ee04850-kube-api-access-mmxmx\") pod \"cert-manager-759f64656b-jfsrj\" (UID: \"d27683ee-d964-41e5-a38c-b4ff4ee04850\") " pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.874835 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.874681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d27683ee-d964-41e5-a38c-b4ff4ee04850-bound-sa-token\") pod \"cert-manager-759f64656b-jfsrj\" (UID: \"d27683ee-d964-41e5-a38c-b4ff4ee04850\") " 
pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.882592 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.882538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d27683ee-d964-41e5-a38c-b4ff4ee04850-bound-sa-token\") pod \"cert-manager-759f64656b-jfsrj\" (UID: \"d27683ee-d964-41e5-a38c-b4ff4ee04850\") " pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.882733 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.882712 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmxmx\" (UniqueName: \"kubernetes.io/projected/d27683ee-d964-41e5-a38c-b4ff4ee04850-kube-api-access-mmxmx\") pod \"cert-manager-759f64656b-jfsrj\" (UID: \"d27683ee-d964-41e5-a38c-b4ff4ee04850\") " pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:05.963276 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.963190 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt"] Apr 17 20:10:05.986993 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.986953 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt"] Apr 17 20:10:05.987145 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.987088 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:05.989616 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.989587 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n7smc\"" Apr 17 20:10:05.989901 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.989878 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:10:05.990021 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:05.989904 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:10:06.041602 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.041570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-jfsrj" Apr 17 20:10:06.076543 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.076503 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5kr\" (UniqueName: \"kubernetes.io/projected/706e1b9f-0306-4527-bfe8-db594fe926f3-kube-api-access-dl5kr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.076713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.076573 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 
20:10:06.076713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.076639 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.165885 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.165839 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-jfsrj"] Apr 17 20:10:06.169320 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:06.169289 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd27683ee_d964_41e5_a38c_b4ff4ee04850.slice/crio-86b1d1f97e7ae0e04fdda467b444bc0b4b2b10a4249b69490e666f5373dc8e96 WatchSource:0}: Error finding container 86b1d1f97e7ae0e04fdda467b444bc0b4b2b10a4249b69490e666f5373dc8e96: Status 404 returned error can't find the container with id 86b1d1f97e7ae0e04fdda467b444bc0b4b2b10a4249b69490e666f5373dc8e96 Apr 17 20:10:06.177483 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.177446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.177624 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.177493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5kr\" (UniqueName: \"kubernetes.io/projected/706e1b9f-0306-4527-bfe8-db594fe926f3-kube-api-access-dl5kr\") 
pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.177624 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.177588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.177819 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.177797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.177893 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.177875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.184876 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.184852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5kr\" (UniqueName: \"kubernetes.io/projected/706e1b9f-0306-4527-bfe8-db594fe926f3-kube-api-access-dl5kr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt\" (UID: 
\"706e1b9f-0306-4527-bfe8-db594fe926f3\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.297706 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.297674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:06.442529 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.442504 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt"] Apr 17 20:10:06.445059 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:06.445015 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706e1b9f_0306_4527_bfe8_db594fe926f3.slice/crio-18d122cf3cbce777369b40b8e96a7b7203b845755695851912d3c0a756f3ce42 WatchSource:0}: Error finding container 18d122cf3cbce777369b40b8e96a7b7203b845755695851912d3c0a756f3ce42: Status 404 returned error can't find the container with id 18d122cf3cbce777369b40b8e96a7b7203b845755695851912d3c0a756f3ce42 Apr 17 20:10:06.521286 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.521237 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" event={"ID":"706e1b9f-0306-4527-bfe8-db594fe926f3","Type":"ContainerStarted","Data":"18d122cf3cbce777369b40b8e96a7b7203b845755695851912d3c0a756f3ce42"} Apr 17 20:10:06.522921 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.522890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-jfsrj" event={"ID":"d27683ee-d964-41e5-a38c-b4ff4ee04850","Type":"ContainerStarted","Data":"f69a5eb84bff61e18fe5404f41f77f6526f57cd126ac40cdebb5ebfc29b04ff6"} Apr 17 20:10:06.523053 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.522929 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-jfsrj" event={"ID":"d27683ee-d964-41e5-a38c-b4ff4ee04850","Type":"ContainerStarted","Data":"86b1d1f97e7ae0e04fdda467b444bc0b4b2b10a4249b69490e666f5373dc8e96"} Apr 17 20:10:06.539167 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:06.539113 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-jfsrj" podStartSLOduration=1.53909714 podStartE2EDuration="1.53909714s" podCreationTimestamp="2026-04-17 20:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:10:06.537732405 +0000 UTC m=+354.877886754" watchObservedRunningTime="2026-04-17 20:10:06.53909714 +0000 UTC m=+354.879251490" Apr 17 20:10:07.528186 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:07.528143 2568 generic.go:358] "Generic (PLEG): container finished" podID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerID="110c7d586972b100ea219a893c58b8d4bd424b2bf3852f89cf1c38f91c797635" exitCode=0 Apr 17 20:10:07.528642 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:07.528231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" event={"ID":"706e1b9f-0306-4527-bfe8-db594fe926f3","Type":"ContainerDied","Data":"110c7d586972b100ea219a893c58b8d4bd424b2bf3852f89cf1c38f91c797635"} Apr 17 20:10:08.539330 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:08.539242 2568 generic.go:358] "Generic (PLEG): container finished" podID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerID="f7e03204a69d9533996542b15255730a3032881890e0ea421e1dadb305911a1f" exitCode=0 Apr 17 20:10:08.539330 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:08.539316 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" 
event={"ID":"706e1b9f-0306-4527-bfe8-db594fe926f3","Type":"ContainerDied","Data":"f7e03204a69d9533996542b15255730a3032881890e0ea421e1dadb305911a1f"} Apr 17 20:10:09.544687 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:09.544648 2568 generic.go:358] "Generic (PLEG): container finished" podID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerID="c8b98380705cef739ce1e98cbfbdb19fae7d1582d7ce1775a89ae94038c967e9" exitCode=0 Apr 17 20:10:09.545074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:09.544698 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" event={"ID":"706e1b9f-0306-4527-bfe8-db594fe926f3","Type":"ContainerDied","Data":"c8b98380705cef739ce1e98cbfbdb19fae7d1582d7ce1775a89ae94038c967e9"} Apr 17 20:10:10.677843 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.677817 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:10.720644 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.720579 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-util\") pod \"706e1b9f-0306-4527-bfe8-db594fe926f3\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " Apr 17 20:10:10.720816 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.720663 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-bundle\") pod \"706e1b9f-0306-4527-bfe8-db594fe926f3\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " Apr 17 20:10:10.720816 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.720752 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl5kr\" (UniqueName: 
\"kubernetes.io/projected/706e1b9f-0306-4527-bfe8-db594fe926f3-kube-api-access-dl5kr\") pod \"706e1b9f-0306-4527-bfe8-db594fe926f3\" (UID: \"706e1b9f-0306-4527-bfe8-db594fe926f3\") " Apr 17 20:10:10.721514 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.721480 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-bundle" (OuterVolumeSpecName: "bundle") pod "706e1b9f-0306-4527-bfe8-db594fe926f3" (UID: "706e1b9f-0306-4527-bfe8-db594fe926f3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:10.722919 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.722884 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706e1b9f-0306-4527-bfe8-db594fe926f3-kube-api-access-dl5kr" (OuterVolumeSpecName: "kube-api-access-dl5kr") pod "706e1b9f-0306-4527-bfe8-db594fe926f3" (UID: "706e1b9f-0306-4527-bfe8-db594fe926f3"). InnerVolumeSpecName "kube-api-access-dl5kr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:10:10.726067 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.726031 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-util" (OuterVolumeSpecName: "util") pod "706e1b9f-0306-4527-bfe8-db594fe926f3" (UID: "706e1b9f-0306-4527-bfe8-db594fe926f3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:10.821738 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.821660 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:10.821738 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.821691 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/706e1b9f-0306-4527-bfe8-db594fe926f3-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:10.821738 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:10.821702 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dl5kr\" (UniqueName: \"kubernetes.io/projected/706e1b9f-0306-4527-bfe8-db594fe926f3-kube-api-access-dl5kr\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:11.553284 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:11.553243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" event={"ID":"706e1b9f-0306-4527-bfe8-db594fe926f3","Type":"ContainerDied","Data":"18d122cf3cbce777369b40b8e96a7b7203b845755695851912d3c0a756f3ce42"} Apr 17 20:10:11.553284 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:11.553274 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mgsnt" Apr 17 20:10:11.553284 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:11.553283 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d122cf3cbce777369b40b8e96a7b7203b845755695851912d3c0a756f3ce42" Apr 17 20:10:22.917679 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.917643 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn"] Apr 17 20:10:22.918048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.917992 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="pull" Apr 17 20:10:22.918048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.918004 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="pull" Apr 17 20:10:22.918048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.918027 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="util" Apr 17 20:10:22.918048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.918033 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="util" Apr 17 20:10:22.918048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.918045 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="extract" Apr 17 20:10:22.918048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.918051 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="extract" Apr 17 20:10:22.918229 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.918102 2568 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="706e1b9f-0306-4527-bfe8-db594fe926f3" containerName="extract" Apr 17 20:10:22.923799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.923773 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:22.926769 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.926736 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 20:10:22.926919 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.926772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-csxrb\"" Apr 17 20:10:22.926919 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.926783 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 20:10:22.926919 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.926821 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 20:10:22.927081 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.926942 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 20:10:22.939928 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:22.939863 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn"] Apr 17 20:10:23.023249 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.023217 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr"] Apr 17 20:10:23.026901 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.026883 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.030632 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.030591 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n7smc\"" Apr 17 20:10:23.031409 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.031378 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:10:23.031894 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.031876 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:10:23.034920 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.034895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.035015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.034937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.035015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.034979 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qggcw\" (UniqueName: 
\"kubernetes.io/projected/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-kube-api-access-qggcw\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.043073 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.043048 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr"] Apr 17 20:10:23.135807 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.135771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.136015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.135827 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq57p\" (UniqueName: \"kubernetes.io/projected/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-kube-api-access-wq57p\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.136015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.135896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 
20:10:23.136015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.135954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.136015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.135981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qggcw\" (UniqueName: \"kubernetes.io/projected/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-kube-api-access-qggcw\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.136209 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.136068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.138428 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.138383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.138428 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.138426 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.144318 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.144292 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qggcw\" (UniqueName: \"kubernetes.io/projected/8064ee4e-f005-441a-8dcf-9dae03e7d1b1-kube-api-access-qggcw\") pod \"opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn\" (UID: \"8064ee4e-f005-441a-8dcf-9dae03e7d1b1\") " pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.235264 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.235221 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:23.237345 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.237320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.237439 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.237422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wq57p\" (UniqueName: \"kubernetes.io/projected/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-kube-api-access-wq57p\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.237539 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.237523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.237783 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.237761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.237860 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.237791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.247059 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.247004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq57p\" (UniqueName: \"kubernetes.io/projected/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-kube-api-access-wq57p\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 
20:10:23.338792 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.338688 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:23.402017 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.401953 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn"] Apr 17 20:10:23.409318 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:23.409278 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8064ee4e_f005_441a_8dcf_9dae03e7d1b1.slice/crio-5077cec7ecd402406853b85b2177832584270ffcb3c5acce20944613724087ba WatchSource:0}: Error finding container 5077cec7ecd402406853b85b2177832584270ffcb3c5acce20944613724087ba: Status 404 returned error can't find the container with id 5077cec7ecd402406853b85b2177832584270ffcb3c5acce20944613724087ba Apr 17 20:10:23.484171 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.484134 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr"] Apr 17 20:10:23.487200 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:23.487129 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ab3eee_2110_4859_8f3a_e0bf6ecfdf0f.slice/crio-5e61accb83cbbad2a4b1d5e1ab8b8c7ec82ac0d11db07271b280a0b246f4c8a6 WatchSource:0}: Error finding container 5e61accb83cbbad2a4b1d5e1ab8b8c7ec82ac0d11db07271b280a0b246f4c8a6: Status 404 returned error can't find the container with id 5e61accb83cbbad2a4b1d5e1ab8b8c7ec82ac0d11db07271b280a0b246f4c8a6 Apr 17 20:10:23.596830 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.596793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" event={"ID":"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f","Type":"ContainerStarted","Data":"ff26d399a7e0c7f819e23d248c8a818c38f139b8a8606c761decc34b97f231ab"} Apr 17 20:10:23.597004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.596837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" event={"ID":"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f","Type":"ContainerStarted","Data":"5e61accb83cbbad2a4b1d5e1ab8b8c7ec82ac0d11db07271b280a0b246f4c8a6"} Apr 17 20:10:23.598052 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:23.598020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" event={"ID":"8064ee4e-f005-441a-8dcf-9dae03e7d1b1","Type":"ContainerStarted","Data":"5077cec7ecd402406853b85b2177832584270ffcb3c5acce20944613724087ba"} Apr 17 20:10:24.605034 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:24.604991 2568 generic.go:358] "Generic (PLEG): container finished" podID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerID="ff26d399a7e0c7f819e23d248c8a818c38f139b8a8606c761decc34b97f231ab" exitCode=0 Apr 17 20:10:24.605502 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:24.605075 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" event={"ID":"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f","Type":"ContainerDied","Data":"ff26d399a7e0c7f819e23d248c8a818c38f139b8a8606c761decc34b97f231ab"} Apr 17 20:10:26.615097 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:26.615061 2568 generic.go:358] "Generic (PLEG): container finished" podID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerID="911433f9de5279a8af29bf6aed5e3b231208bc33722a48014ba98222061e8400" exitCode=0 Apr 17 20:10:26.615752 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:26.615145 
2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" event={"ID":"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f","Type":"ContainerDied","Data":"911433f9de5279a8af29bf6aed5e3b231208bc33722a48014ba98222061e8400"} Apr 17 20:10:26.616680 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:26.616640 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" event={"ID":"8064ee4e-f005-441a-8dcf-9dae03e7d1b1","Type":"ContainerStarted","Data":"b53f6d8c592c7c29d6af65700bb662792f7ef814150f553392f6748eed71f721"} Apr 17 20:10:26.616807 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:26.616771 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:26.656242 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:26.656190 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" podStartSLOduration=2.246699745 podStartE2EDuration="4.656174852s" podCreationTimestamp="2026-04-17 20:10:22 +0000 UTC" firstStartedPulling="2026-04-17 20:10:23.411601957 +0000 UTC m=+371.751756287" lastFinishedPulling="2026-04-17 20:10:25.821077063 +0000 UTC m=+374.161231394" observedRunningTime="2026-04-17 20:10:26.655078652 +0000 UTC m=+374.995233028" watchObservedRunningTime="2026-04-17 20:10:26.656174852 +0000 UTC m=+374.996329202" Apr 17 20:10:27.623286 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:27.623251 2568 generic.go:358] "Generic (PLEG): container finished" podID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerID="87c3494ece6532d39eeb9eb350dfb03eaa11855cb0fecf90c74bf7de87e121e1" exitCode=0 Apr 17 20:10:27.623724 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:27.623338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" event={"ID":"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f","Type":"ContainerDied","Data":"87c3494ece6532d39eeb9eb350dfb03eaa11855cb0fecf90c74bf7de87e121e1"} Apr 17 20:10:28.753526 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.753500 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:28.888378 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.887259 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq57p\" (UniqueName: \"kubernetes.io/projected/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-kube-api-access-wq57p\") pod \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " Apr 17 20:10:28.888378 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.887314 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-util\") pod \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " Apr 17 20:10:28.888378 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.887464 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-bundle\") pod \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\" (UID: \"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f\") " Apr 17 20:10:28.888960 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.888926 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-bundle" (OuterVolumeSpecName: "bundle") pod "51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" (UID: "51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:28.890552 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.890525 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-kube-api-access-wq57p" (OuterVolumeSpecName: "kube-api-access-wq57p") pod "51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" (UID: "51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f"). InnerVolumeSpecName "kube-api-access-wq57p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:10:28.896689 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.896656 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-util" (OuterVolumeSpecName: "util") pod "51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" (UID: "51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:28.988130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.988087 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:28.988130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.988124 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wq57p\" (UniqueName: \"kubernetes.io/projected/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-kube-api-access-wq57p\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:28.988130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:28.988135 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:29.631703 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:29.631664 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" event={"ID":"51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f","Type":"ContainerDied","Data":"5e61accb83cbbad2a4b1d5e1ab8b8c7ec82ac0d11db07271b280a0b246f4c8a6"} Apr 17 20:10:29.631703 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:29.631698 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e61accb83cbbad2a4b1d5e1ab8b8c7ec82ac0d11db07271b280a0b246f4c8a6" Apr 17 20:10:29.631703 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:29.631705 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9944rr" Apr 17 20:10:33.686275 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686240 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg"] Apr 17 20:10:33.686671 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686625 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="extract" Apr 17 20:10:33.686671 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686638 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="extract" Apr 17 20:10:33.686671 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686654 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="pull" Apr 17 20:10:33.686671 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686661 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="pull" Apr 17 20:10:33.686800 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686679 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="util" Apr 17 20:10:33.686800 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686686 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="util" Apr 17 20:10:33.686800 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.686742 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="51ab3eee-2110-4859-8f3a-e0bf6ecfdf0f" containerName="extract" Apr 17 20:10:33.689722 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.689705 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.693270 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.693245 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 20:10:33.693423 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.693247 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 20:10:33.693423 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.693285 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 20:10:33.693423 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.693249 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pgrnq\"" Apr 17 20:10:33.693423 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.693342 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 20:10:33.693423 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.693385 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:10:33.698803 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.698780 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg"] Apr 17 20:10:33.829113 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.829066 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ad2e17ec-813d-417b-a08f-a8a0ac70771d-manager-config\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.829310 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.829134 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad2e17ec-813d-417b-a08f-a8a0ac70771d-metrics-cert\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.829310 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.829172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clhq5\" (UniqueName: \"kubernetes.io/projected/ad2e17ec-813d-417b-a08f-a8a0ac70771d-kube-api-access-clhq5\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.829310 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.829200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad2e17ec-813d-417b-a08f-a8a0ac70771d-cert\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: 
\"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.930279 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.930232 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ad2e17ec-813d-417b-a08f-a8a0ac70771d-manager-config\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.930514 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.930298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad2e17ec-813d-417b-a08f-a8a0ac70771d-metrics-cert\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.930514 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.930344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clhq5\" (UniqueName: \"kubernetes.io/projected/ad2e17ec-813d-417b-a08f-a8a0ac70771d-kube-api-access-clhq5\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.930514 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.930380 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad2e17ec-813d-417b-a08f-a8a0ac70771d-cert\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.930906 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.930881 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ad2e17ec-813d-417b-a08f-a8a0ac70771d-manager-config\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.932801 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.932771 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad2e17ec-813d-417b-a08f-a8a0ac70771d-metrics-cert\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.932886 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.932844 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ad2e17ec-813d-417b-a08f-a8a0ac70771d-cert\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.943063 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.943005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clhq5\" (UniqueName: \"kubernetes.io/projected/ad2e17ec-813d-417b-a08f-a8a0ac70771d-kube-api-access-clhq5\") pod \"lws-controller-manager-fcb6f8ffb-qgsgg\" (UID: \"ad2e17ec-813d-417b-a08f-a8a0ac70771d\") " pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:33.999686 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:33.999640 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:34.134670 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:34.134638 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg"] Apr 17 20:10:34.136561 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:34.136531 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2e17ec_813d_417b_a08f_a8a0ac70771d.slice/crio-cbda8941730bc0f7c2dfc30679471f49a3377b8d9a45c2ea36b34948fcc25b3e WatchSource:0}: Error finding container cbda8941730bc0f7c2dfc30679471f49a3377b8d9a45c2ea36b34948fcc25b3e: Status 404 returned error can't find the container with id cbda8941730bc0f7c2dfc30679471f49a3377b8d9a45c2ea36b34948fcc25b3e Apr 17 20:10:34.649488 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:34.649447 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" event={"ID":"ad2e17ec-813d-417b-a08f-a8a0ac70771d","Type":"ContainerStarted","Data":"cbda8941730bc0f7c2dfc30679471f49a3377b8d9a45c2ea36b34948fcc25b3e"} Apr 17 20:10:36.659413 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:36.659366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" event={"ID":"ad2e17ec-813d-417b-a08f-a8a0ac70771d","Type":"ContainerStarted","Data":"f11a7cdd6f4a953a12dd23eb5df8a87404c4fb27b8079472b45709677b12e557"} Apr 17 20:10:36.659893 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:36.659452 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:36.676704 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:36.676651 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" podStartSLOduration=2.059666506 podStartE2EDuration="3.676634579s" podCreationTimestamp="2026-04-17 20:10:33 +0000 UTC" firstStartedPulling="2026-04-17 20:10:34.138256134 +0000 UTC m=+382.478410464" lastFinishedPulling="2026-04-17 20:10:35.755224204 +0000 UTC m=+384.095378537" observedRunningTime="2026-04-17 20:10:36.674387371 +0000 UTC m=+385.014541724" watchObservedRunningTime="2026-04-17 20:10:36.676634579 +0000 UTC m=+385.016788930" Apr 17 20:10:37.625817 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:37.625788 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn" Apr 17 20:10:39.893374 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.893339 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8"] Apr 17 20:10:39.897507 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.897487 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:39.899946 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.899923 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:10:39.900735 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.900720 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n7smc\"" Apr 17 20:10:39.900799 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.900748 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:10:39.909989 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.909955 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8"] Apr 17 20:10:39.997580 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.997546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtzzs\" (UniqueName: \"kubernetes.io/projected/3095cd68-bb43-47d9-8db5-2d890d33d970-kube-api-access-qtzzs\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:39.997756 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.997595 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:39.997756 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:39.997620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.098692 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.098655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtzzs\" (UniqueName: \"kubernetes.io/projected/3095cd68-bb43-47d9-8db5-2d890d33d970-kube-api-access-qtzzs\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.098882 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.098701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.098882 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.098723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.099045 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.099029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.099104 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.099088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.107637 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.107606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtzzs\" (UniqueName: \"kubernetes.io/projected/3095cd68-bb43-47d9-8db5-2d890d33d970-kube-api-access-qtzzs\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.207853 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.207817 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:40.334156 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.334129 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8"] Apr 17 20:10:40.335826 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:40.335796 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3095cd68_bb43_47d9_8db5_2d890d33d970.slice/crio-6fbf8b71d4a4e979e4f4a3e6a60f8a59238e7073571ac531194766fb94778e1f WatchSource:0}: Error finding container 6fbf8b71d4a4e979e4f4a3e6a60f8a59238e7073571ac531194766fb94778e1f: Status 404 returned error can't find the container with id 6fbf8b71d4a4e979e4f4a3e6a60f8a59238e7073571ac531194766fb94778e1f Apr 17 20:10:40.676030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.675937 2568 generic.go:358] "Generic (PLEG): container finished" podID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerID="9e0d30a2124e81d4006d043d2b6fd0fdd10960c19d718b9dd789f9ae3ff2b9cc" exitCode=0 Apr 17 20:10:40.676030 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.676018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" event={"ID":"3095cd68-bb43-47d9-8db5-2d890d33d970","Type":"ContainerDied","Data":"9e0d30a2124e81d4006d043d2b6fd0fdd10960c19d718b9dd789f9ae3ff2b9cc"} Apr 17 20:10:40.676204 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:40.676056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" event={"ID":"3095cd68-bb43-47d9-8db5-2d890d33d970","Type":"ContainerStarted","Data":"6fbf8b71d4a4e979e4f4a3e6a60f8a59238e7073571ac531194766fb94778e1f"} Apr 17 20:10:41.682052 ip-10-0-130-159 kubenswrapper[2568]: 
I0417 20:10:41.681965 2568 generic.go:358] "Generic (PLEG): container finished" podID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerID="07bedf78ee5104d6010e735ee541bdea78ff5c1184e4be4cefb78e129f9d8b94" exitCode=0 Apr 17 20:10:41.682437 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:41.682052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" event={"ID":"3095cd68-bb43-47d9-8db5-2d890d33d970","Type":"ContainerDied","Data":"07bedf78ee5104d6010e735ee541bdea78ff5c1184e4be4cefb78e129f9d8b94"} Apr 17 20:10:42.688450 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:42.688416 2568 generic.go:358] "Generic (PLEG): container finished" podID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerID="609c50995288f4598036cdc7db92099b479b0604eb2a18449af4ea07d17c908e" exitCode=0 Apr 17 20:10:42.688840 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:42.688502 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" event={"ID":"3095cd68-bb43-47d9-8db5-2d890d33d970","Type":"ContainerDied","Data":"609c50995288f4598036cdc7db92099b479b0604eb2a18449af4ea07d17c908e"} Apr 17 20:10:43.828233 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.828206 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:43.831850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.831827 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtzzs\" (UniqueName: \"kubernetes.io/projected/3095cd68-bb43-47d9-8db5-2d890d33d970-kube-api-access-qtzzs\") pod \"3095cd68-bb43-47d9-8db5-2d890d33d970\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " Apr 17 20:10:43.831986 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.831896 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-bundle\") pod \"3095cd68-bb43-47d9-8db5-2d890d33d970\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " Apr 17 20:10:43.831986 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.831934 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-util\") pod \"3095cd68-bb43-47d9-8db5-2d890d33d970\" (UID: \"3095cd68-bb43-47d9-8db5-2d890d33d970\") " Apr 17 20:10:43.832808 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.832782 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-bundle" (OuterVolumeSpecName: "bundle") pod "3095cd68-bb43-47d9-8db5-2d890d33d970" (UID: "3095cd68-bb43-47d9-8db5-2d890d33d970"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:43.833839 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.833813 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3095cd68-bb43-47d9-8db5-2d890d33d970-kube-api-access-qtzzs" (OuterVolumeSpecName: "kube-api-access-qtzzs") pod "3095cd68-bb43-47d9-8db5-2d890d33d970" (UID: "3095cd68-bb43-47d9-8db5-2d890d33d970"). InnerVolumeSpecName "kube-api-access-qtzzs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:10:43.837277 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.837253 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-util" (OuterVolumeSpecName: "util") pod "3095cd68-bb43-47d9-8db5-2d890d33d970" (UID: "3095cd68-bb43-47d9-8db5-2d890d33d970"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:43.932826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.932788 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:43.932826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.932822 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3095cd68-bb43-47d9-8db5-2d890d33d970-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:43.932826 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:43.932833 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qtzzs\" (UniqueName: \"kubernetes.io/projected/3095cd68-bb43-47d9-8db5-2d890d33d970-kube-api-access-qtzzs\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:44.698036 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:44.698003 2568 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" Apr 17 20:10:44.698036 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:44.698009 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835gphz8" event={"ID":"3095cd68-bb43-47d9-8db5-2d890d33d970","Type":"ContainerDied","Data":"6fbf8b71d4a4e979e4f4a3e6a60f8a59238e7073571ac531194766fb94778e1f"} Apr 17 20:10:44.698239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:44.698049 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fbf8b71d4a4e979e4f4a3e6a60f8a59238e7073571ac531194766fb94778e1f" Apr 17 20:10:47.666080 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:47.666051 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fcb6f8ffb-qgsgg" Apr 17 20:10:49.107726 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.107635 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr"] Apr 17 20:10:49.108074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.107996 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="extract" Apr 17 20:10:49.108074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.108007 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="extract" Apr 17 20:10:49.108074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.108026 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="pull" Apr 17 20:10:49.108074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.108031 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="pull" Apr 17 20:10:49.108074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.108047 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="util" Apr 17 20:10:49.108074 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.108053 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="util" Apr 17 20:10:49.108240 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.108119 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3095cd68-bb43-47d9-8db5-2d890d33d970" containerName="extract" Apr 17 20:10:49.112611 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.112586 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.116983 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.116958 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n7smc\"" Apr 17 20:10:49.117129 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.116998 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 20:10:49.117270 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.117255 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 20:10:49.138573 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.138536 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr"] Apr 17 20:10:49.172330 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.172278 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.172552 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.172358 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9dl8\" (UniqueName: \"kubernetes.io/projected/f92fa499-17a9-4991-bd5c-677425b4b34e-kube-api-access-t9dl8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.172552 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.172381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.273068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.273023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.273239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.273087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t9dl8\" (UniqueName: \"kubernetes.io/projected/f92fa499-17a9-4991-bd5c-677425b4b34e-kube-api-access-t9dl8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.273239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.273107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.273443 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.273425 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.273497 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.273481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.287651 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.287612 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9dl8\" (UniqueName: 
\"kubernetes.io/projected/f92fa499-17a9-4991-bd5c-677425b4b34e-kube-api-access-t9dl8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.423095 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.422996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:49.568426 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.568376 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr"] Apr 17 20:10:49.570368 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:10:49.570335 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92fa499_17a9_4991_bd5c_677425b4b34e.slice/crio-2ccd9c2ba7b3d0ca36c3633eb4768772e9f7cd217ff989b64be99ef969d98e03 WatchSource:0}: Error finding container 2ccd9c2ba7b3d0ca36c3633eb4768772e9f7cd217ff989b64be99ef969d98e03: Status 404 returned error can't find the container with id 2ccd9c2ba7b3d0ca36c3633eb4768772e9f7cd217ff989b64be99ef969d98e03 Apr 17 20:10:49.732247 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.732213 2568 generic.go:358] "Generic (PLEG): container finished" podID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerID="34a79f013c56189e88327304c60c24366c1f4df0996bfc4eba2aa67a179ebb09" exitCode=0 Apr 17 20:10:49.732469 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.732303 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" event={"ID":"f92fa499-17a9-4991-bd5c-677425b4b34e","Type":"ContainerDied","Data":"34a79f013c56189e88327304c60c24366c1f4df0996bfc4eba2aa67a179ebb09"} Apr 17 
20:10:49.732469 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:49.732339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" event={"ID":"f92fa499-17a9-4991-bd5c-677425b4b34e","Type":"ContainerStarted","Data":"2ccd9c2ba7b3d0ca36c3633eb4768772e9f7cd217ff989b64be99ef969d98e03"} Apr 17 20:10:51.741558 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:51.741522 2568 generic.go:358] "Generic (PLEG): container finished" podID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerID="4864df2a8c4dbf37d900fec7a5cc8a10bf8c165150003e9cc61b0eebf5d121f9" exitCode=0 Apr 17 20:10:51.742039 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:51.741610 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" event={"ID":"f92fa499-17a9-4991-bd5c-677425b4b34e","Type":"ContainerDied","Data":"4864df2a8c4dbf37d900fec7a5cc8a10bf8c165150003e9cc61b0eebf5d121f9"} Apr 17 20:10:52.747851 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:52.747814 2568 generic.go:358] "Generic (PLEG): container finished" podID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerID="6a9e095e8d37d2cd89de2bc348e3977173363417a858b03f2d0a73efd451e9fe" exitCode=0 Apr 17 20:10:52.748228 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:52.747903 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" event={"ID":"f92fa499-17a9-4991-bd5c-677425b4b34e","Type":"ContainerDied","Data":"6a9e095e8d37d2cd89de2bc348e3977173363417a858b03f2d0a73efd451e9fe"} Apr 17 20:10:53.875432 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.875373 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" Apr 17 20:10:53.913579 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.913545 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-bundle\") pod \"f92fa499-17a9-4991-bd5c-677425b4b34e\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " Apr 17 20:10:53.913739 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.913633 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9dl8\" (UniqueName: \"kubernetes.io/projected/f92fa499-17a9-4991-bd5c-677425b4b34e-kube-api-access-t9dl8\") pod \"f92fa499-17a9-4991-bd5c-677425b4b34e\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " Apr 17 20:10:53.913739 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.913682 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-util\") pod \"f92fa499-17a9-4991-bd5c-677425b4b34e\" (UID: \"f92fa499-17a9-4991-bd5c-677425b4b34e\") " Apr 17 20:10:53.914770 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.914733 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-bundle" (OuterVolumeSpecName: "bundle") pod "f92fa499-17a9-4991-bd5c-677425b4b34e" (UID: "f92fa499-17a9-4991-bd5c-677425b4b34e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:53.915809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.915775 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92fa499-17a9-4991-bd5c-677425b4b34e-kube-api-access-t9dl8" (OuterVolumeSpecName: "kube-api-access-t9dl8") pod "f92fa499-17a9-4991-bd5c-677425b4b34e" (UID: "f92fa499-17a9-4991-bd5c-677425b4b34e"). InnerVolumeSpecName "kube-api-access-t9dl8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:10:53.921948 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:53.921918 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-util" (OuterVolumeSpecName: "util") pod "f92fa499-17a9-4991-bd5c-677425b4b34e" (UID: "f92fa499-17a9-4991-bd5c-677425b4b34e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 20:10:54.014837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:54.014742 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:54.014837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:54.014776 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f92fa499-17a9-4991-bd5c-677425b4b34e-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:54.014837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:54.014786 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t9dl8\" (UniqueName: \"kubernetes.io/projected/f92fa499-17a9-4991-bd5c-677425b4b34e-kube-api-access-t9dl8\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:10:54.756690 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:54.756654 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr" event={"ID":"f92fa499-17a9-4991-bd5c-677425b4b34e","Type":"ContainerDied","Data":"2ccd9c2ba7b3d0ca36c3633eb4768772e9f7cd217ff989b64be99ef969d98e03"}
Apr 17 20:10:54.756690 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:54.756684 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c24sqfr"
Apr 17 20:10:54.756906 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:10:54.756691 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ccd9c2ba7b3d0ca36c3633eb4768772e9f7cd217ff989b64be99ef969d98e03"
Apr 17 20:11:02.379932 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.379883 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"]
Apr 17 20:11:02.380497 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380475 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="pull"
Apr 17 20:11:02.380572 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380499 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="pull"
Apr 17 20:11:02.380572 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380544 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="util"
Apr 17 20:11:02.380572 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380555 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="util"
Apr 17 20:11:02.380572 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380566 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="extract"
Apr 17 20:11:02.380690 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380576 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="extract"
Apr 17 20:11:02.380690 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.380683 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f92fa499-17a9-4991-bd5c-677425b4b34e" containerName="extract"
Apr 17 20:11:02.384076 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.384055 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.386817 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.386785 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 20:11:02.386962 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.386816 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 20:11:02.387155 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.387133 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 20:11:02.387277 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.387133 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-dc78l\""
Apr 17 20:11:02.394161 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.394138 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"]
Apr 17 20:11:02.487973 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.487929 2568 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.487973 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.487976 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb86n\" (UniqueName: \"kubernetes.io/projected/17c4a279-6c87-455d-aa72-5a7d05af451b-kube-api-access-sb86n\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.488001 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.488062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]: I0417
20:11:02.488096 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/17c4a279-6c87-455d-aa72-5a7d05af451b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.488121 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.488145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.488185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.488224 ip-10-0-130-159 kubenswrapper[2568]:
I0417 20:11:02.488221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589153 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589317 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589175 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589317 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb86n\" (UniqueName: \"kubernetes.io/projected/17c4a279-6c87-455d-aa72-5a7d05af451b-kube-api-access-sb86n\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589317 ip-10-0-130-159 kubenswrapper[2568]: I0417
20:11:02.589223 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589317 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589317 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589285 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/17c4a279-6c87-455d-aa72-5a7d05af451b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589613 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589319 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589613 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589354 2568 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589613 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589383 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589773 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589672 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589868 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.589956 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589847 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName:
\"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.590019 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.589992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.590116 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.590094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/17c4a279-6c87-455d-aa72-5a7d05af451b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.592015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.591987 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.592110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.592094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-podinfo\") pod
\"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.597302 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.597279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/17c4a279-6c87-455d-aa72-5a7d05af451b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.597487 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.597432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb86n\" (UniqueName: \"kubernetes.io/projected/17c4a279-6c87-455d-aa72-5a7d05af451b-kube-api-access-sb86n\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f5wwf2\" (UID: \"17c4a279-6c87-455d-aa72-5a7d05af451b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.698278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.698235 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:02.829176 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:02.829139 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"]
Apr 17 20:11:02.830654 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:02.830622 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c4a279_6c87_455d_aa72_5a7d05af451b.slice/crio-857af8e01c739dc225254a1a89784530fdef56601d5b1445b68863c6edd7b1f8 WatchSource:0}: Error finding container 857af8e01c739dc225254a1a89784530fdef56601d5b1445b68863c6edd7b1f8: Status 404 returned error can't find the container with id 857af8e01c739dc225254a1a89784530fdef56601d5b1445b68863c6edd7b1f8
Apr 17 20:11:03.792947 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:03.792901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2" event={"ID":"17c4a279-6c87-455d-aa72-5a7d05af451b","Type":"ContainerStarted","Data":"857af8e01c739dc225254a1a89784530fdef56601d5b1445b68863c6edd7b1f8"}
Apr 17 20:11:05.374564 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:05.374519 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 20:11:05.374823 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:05.374619 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 20:11:05.374823 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:05.374652 2568 kubelet_resources.go:45] "Allocatable"
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 20:11:05.803084 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:05.803044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2" event={"ID":"17c4a279-6c87-455d-aa72-5a7d05af451b","Type":"ContainerStarted","Data":"dd6b5c0514975769063b8e470d5bf1f97ed10afc9a20e35fc67ce5c5a7f1bfb8"}
Apr 17 20:11:05.823266 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:05.823194 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2" podStartSLOduration=1.281538765 podStartE2EDuration="3.823176483s" podCreationTimestamp="2026-04-17 20:11:02 +0000 UTC" firstStartedPulling="2026-04-17 20:11:02.83255914 +0000 UTC m=+411.172713472" lastFinishedPulling="2026-04-17 20:11:05.374196847 +0000 UTC m=+413.714351190" observedRunningTime="2026-04-17 20:11:05.821372852 +0000 UTC m=+414.161527218" watchObservedRunningTime="2026-04-17 20:11:05.823176483 +0000 UTC m=+414.163330836"
Apr 17 20:11:06.698617 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:06.698571 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:06.703685 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:06.703658 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:06.807032 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:06.807003 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:06.808219 ip-10-0-130-159 kubenswrapper[2568]:
I0417 20:11:06.808192 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f5wwf2"
Apr 17 20:11:40.847798 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.847709 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2f94f"]
Apr 17 20:11:40.853508 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.853484 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2f94f"
Apr 17 20:11:40.855942 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.855911 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-dq2vf\""
Apr 17 20:11:40.856103 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.856008 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 20:11:40.856103 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.856085 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 20:11:40.864757 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.864732 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2f94f"]
Apr 17 20:11:40.935690 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:40.935642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnmx\" (UniqueName: \"kubernetes.io/projected/d4624ab8-c2f5-48d8-98c0-89cb007e406c-kube-api-access-psnmx\") pod \"kuadrant-operator-catalog-2f94f\" (UID: \"d4624ab8-c2f5-48d8-98c0-89cb007e406c\") " pod="kuadrant-system/kuadrant-operator-catalog-2f94f"
Apr 17 20:11:41.036367 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.036333 2568 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-psnmx\" (UniqueName: \"kubernetes.io/projected/d4624ab8-c2f5-48d8-98c0-89cb007e406c-kube-api-access-psnmx\") pod \"kuadrant-operator-catalog-2f94f\" (UID: \"d4624ab8-c2f5-48d8-98c0-89cb007e406c\") " pod="kuadrant-system/kuadrant-operator-catalog-2f94f"
Apr 17 20:11:41.044122 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.044068 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnmx\" (UniqueName: \"kubernetes.io/projected/d4624ab8-c2f5-48d8-98c0-89cb007e406c-kube-api-access-psnmx\") pod \"kuadrant-operator-catalog-2f94f\" (UID: \"d4624ab8-c2f5-48d8-98c0-89cb007e406c\") " pod="kuadrant-system/kuadrant-operator-catalog-2f94f"
Apr 17 20:11:41.164240 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.164145 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2f94f"
Apr 17 20:11:41.217928 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.217887 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2f94f"]
Apr 17 20:11:41.293862 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.293828 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2f94f"]
Apr 17 20:11:41.295336 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:41.295307 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4624ab8_c2f5_48d8_98c0_89cb007e406c.slice/crio-f26f00cf008b35713fae25348a71850113821473a7dda9bc3049cf0e12138f85 WatchSource:0}: Error finding container f26f00cf008b35713fae25348a71850113821473a7dda9bc3049cf0e12138f85: Status 404 returned error can't find the container with id f26f00cf008b35713fae25348a71850113821473a7dda9bc3049cf0e12138f85
Apr 17 20:11:41.425416 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.425317 2568
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-xd8dj"]
Apr 17 20:11:41.428810 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.428791 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj"
Apr 17 20:11:41.436060 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.436017 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-xd8dj"]
Apr 17 20:11:41.440554 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.440531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5cjw\" (UniqueName: \"kubernetes.io/projected/ed2e018a-ad4f-479f-9a8c-b68f680c7ca5-kube-api-access-t5cjw\") pod \"kuadrant-operator-catalog-xd8dj\" (UID: \"ed2e018a-ad4f-479f-9a8c-b68f680c7ca5\") " pod="kuadrant-system/kuadrant-operator-catalog-xd8dj"
Apr 17 20:11:41.541282 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.541240 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5cjw\" (UniqueName: \"kubernetes.io/projected/ed2e018a-ad4f-479f-9a8c-b68f680c7ca5-kube-api-access-t5cjw\") pod \"kuadrant-operator-catalog-xd8dj\" (UID: \"ed2e018a-ad4f-479f-9a8c-b68f680c7ca5\") " pod="kuadrant-system/kuadrant-operator-catalog-xd8dj"
Apr 17 20:11:41.549377 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.549339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5cjw\" (UniqueName: \"kubernetes.io/projected/ed2e018a-ad4f-479f-9a8c-b68f680c7ca5-kube-api-access-t5cjw\") pod \"kuadrant-operator-catalog-xd8dj\" (UID: \"ed2e018a-ad4f-479f-9a8c-b68f680c7ca5\") " pod="kuadrant-system/kuadrant-operator-catalog-xd8dj"
Apr 17 20:11:41.739741 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.739705 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj"
Apr 17 20:11:41.873963 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.873930 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-xd8dj"]
Apr 17 20:11:41.898823 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:41.898786 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2e018a_ad4f_479f_9a8c_b68f680c7ca5.slice/crio-0fac584e3984034f4681fb7b8fdbdde47e60b586be5125c2b4742fb6cc161a0e WatchSource:0}: Error finding container 0fac584e3984034f4681fb7b8fdbdde47e60b586be5125c2b4742fb6cc161a0e: Status 404 returned error can't find the container with id 0fac584e3984034f4681fb7b8fdbdde47e60b586be5125c2b4742fb6cc161a0e
Apr 17 20:11:41.940203 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.940141 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" event={"ID":"ed2e018a-ad4f-479f-9a8c-b68f680c7ca5","Type":"ContainerStarted","Data":"0fac584e3984034f4681fb7b8fdbdde47e60b586be5125c2b4742fb6cc161a0e"}
Apr 17 20:11:41.941441 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:41.941387 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" event={"ID":"d4624ab8-c2f5-48d8-98c0-89cb007e406c","Type":"ContainerStarted","Data":"f26f00cf008b35713fae25348a71850113821473a7dda9bc3049cf0e12138f85"}
Apr 17 20:11:42.378121 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.378071 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7447894dd7-6jt4x"]
Apr 17 20:11:42.385278 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.384476 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.393420 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.393298 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7447894dd7-6jt4x"]
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449750 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-oauth-serving-cert\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449827 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-serving-cert\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449854 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-trusted-ca-bundle\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449913 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-oauth-config\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") "
pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449945 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-config\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-service-ca\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.450141 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.449997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhpsg\" (UniqueName: \"kubernetes.io/projected/8c2d5320-08a1-4719-ae72-c3c958a1b866-kube-api-access-mhpsg\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.550621 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-oauth-config\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x"
Apr 17 20:11:42.550621 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550623 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName:
\"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-config\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.550925 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550654 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-service-ca\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.550925 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhpsg\" (UniqueName: \"kubernetes.io/projected/8c2d5320-08a1-4719-ae72-c3c958a1b866-kube-api-access-mhpsg\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.550925 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-oauth-serving-cert\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.550925 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550789 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-serving-cert\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.550925 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.550810 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-trusted-ca-bundle\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.551549 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.551487 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-config\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.551687 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.551632 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-trusted-ca-bundle\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.552060 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.552036 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-oauth-serving-cert\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.552237 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.552208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c2d5320-08a1-4719-ae72-c3c958a1b866-service-ca\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.554174 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.554150 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-serving-cert\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.554492 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.554466 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8c2d5320-08a1-4719-ae72-c3c958a1b866-console-oauth-config\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.560201 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.560178 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhpsg\" (UniqueName: \"kubernetes.io/projected/8c2d5320-08a1-4719-ae72-c3c958a1b866-kube-api-access-mhpsg\") pod \"console-7447894dd7-6jt4x\" (UID: \"8c2d5320-08a1-4719-ae72-c3c958a1b866\") " pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.704015 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.703975 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:42.870365 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:42.870324 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7447894dd7-6jt4x"] Apr 17 20:11:43.249259 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:43.249216 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2d5320_08a1_4719_ae72_c3c958a1b866.slice/crio-0939afc46566b403fde052c109f79ce169fab9ccbed7b3851ad5b53cccd7128a WatchSource:0}: Error finding container 0939afc46566b403fde052c109f79ce169fab9ccbed7b3851ad5b53cccd7128a: Status 404 returned error can't find the container with id 0939afc46566b403fde052c109f79ce169fab9ccbed7b3851ad5b53cccd7128a Apr 17 20:11:43.951217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.951169 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" event={"ID":"ed2e018a-ad4f-479f-9a8c-b68f680c7ca5","Type":"ContainerStarted","Data":"df91c9ee450d37dd44dc472cecaa527dbf8d39fe403ae2002648a87fba5d07c4"} Apr 17 20:11:43.952580 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.952549 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" event={"ID":"d4624ab8-c2f5-48d8-98c0-89cb007e406c","Type":"ContainerStarted","Data":"0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842"} Apr 17 20:11:43.952713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.952593 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" podUID="d4624ab8-c2f5-48d8-98c0-89cb007e406c" containerName="registry-server" containerID="cri-o://0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842" gracePeriod=2 Apr 17 20:11:43.953988 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.953963 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-7447894dd7-6jt4x" event={"ID":"8c2d5320-08a1-4719-ae72-c3c958a1b866","Type":"ContainerStarted","Data":"97eb4a1aa8e900b581f009bda051b2ae80d10e6e129d3b34e55a6a185214a809"} Apr 17 20:11:43.953988 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.953992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7447894dd7-6jt4x" event={"ID":"8c2d5320-08a1-4719-ae72-c3c958a1b866","Type":"ContainerStarted","Data":"0939afc46566b403fde052c109f79ce169fab9ccbed7b3851ad5b53cccd7128a"} Apr 17 20:11:43.967669 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.967624 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" podStartSLOduration=1.5076236490000001 podStartE2EDuration="2.967608315s" podCreationTimestamp="2026-04-17 20:11:41 +0000 UTC" firstStartedPulling="2026-04-17 20:11:41.900361225 +0000 UTC m=+450.240515554" lastFinishedPulling="2026-04-17 20:11:43.360345887 +0000 UTC m=+451.700500220" observedRunningTime="2026-04-17 20:11:43.964815293 +0000 UTC m=+452.304969655" watchObservedRunningTime="2026-04-17 20:11:43.967608315 +0000 UTC m=+452.307762666" Apr 17 20:11:43.979130 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.979084 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" podStartSLOduration=1.9188508469999999 podStartE2EDuration="3.979068699s" podCreationTimestamp="2026-04-17 20:11:40 +0000 UTC" firstStartedPulling="2026-04-17 20:11:41.29657003 +0000 UTC m=+449.636724359" lastFinishedPulling="2026-04-17 20:11:43.356787871 +0000 UTC m=+451.696942211" observedRunningTime="2026-04-17 20:11:43.978106046 +0000 UTC m=+452.318260398" watchObservedRunningTime="2026-04-17 20:11:43.979068699 +0000 UTC m=+452.319223049" Apr 17 20:11:43.994035 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:43.993990 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7447894dd7-6jt4x" podStartSLOduration=1.99397436 podStartE2EDuration="1.99397436s" podCreationTimestamp="2026-04-17 20:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:11:43.992564013 +0000 UTC m=+452.332718365" watchObservedRunningTime="2026-04-17 20:11:43.99397436 +0000 UTC m=+452.334128710" Apr 17 20:11:44.182733 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.182707 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" Apr 17 20:11:44.269131 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.269101 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psnmx\" (UniqueName: \"kubernetes.io/projected/d4624ab8-c2f5-48d8-98c0-89cb007e406c-kube-api-access-psnmx\") pod \"d4624ab8-c2f5-48d8-98c0-89cb007e406c\" (UID: \"d4624ab8-c2f5-48d8-98c0-89cb007e406c\") " Apr 17 20:11:44.271305 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.271275 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4624ab8-c2f5-48d8-98c0-89cb007e406c-kube-api-access-psnmx" (OuterVolumeSpecName: "kube-api-access-psnmx") pod "d4624ab8-c2f5-48d8-98c0-89cb007e406c" (UID: "d4624ab8-c2f5-48d8-98c0-89cb007e406c"). InnerVolumeSpecName "kube-api-access-psnmx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:11:44.369911 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.369860 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-psnmx\" (UniqueName: \"kubernetes.io/projected/d4624ab8-c2f5-48d8-98c0-89cb007e406c-kube-api-access-psnmx\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:11:44.958911 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.958876 2568 generic.go:358] "Generic (PLEG): container finished" podID="d4624ab8-c2f5-48d8-98c0-89cb007e406c" containerID="0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842" exitCode=0 Apr 17 20:11:44.959086 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.958935 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" Apr 17 20:11:44.959086 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.958957 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" event={"ID":"d4624ab8-c2f5-48d8-98c0-89cb007e406c","Type":"ContainerDied","Data":"0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842"} Apr 17 20:11:44.959086 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.959002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2f94f" event={"ID":"d4624ab8-c2f5-48d8-98c0-89cb007e406c","Type":"ContainerDied","Data":"f26f00cf008b35713fae25348a71850113821473a7dda9bc3049cf0e12138f85"} Apr 17 20:11:44.959086 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.959029 2568 scope.go:117] "RemoveContainer" containerID="0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842" Apr 17 20:11:44.968137 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.968103 2568 scope.go:117] "RemoveContainer" containerID="0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842" Apr 17 20:11:44.968498 ip-10-0-130-159 
kubenswrapper[2568]: E0417 20:11:44.968390 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842\": container with ID starting with 0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842 not found: ID does not exist" containerID="0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842" Apr 17 20:11:44.968569 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:44.968507 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842"} err="failed to get container status \"0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842\": rpc error: code = NotFound desc = could not find container \"0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842\": container with ID starting with 0fb48b7e5c83e1c0576029bbe721d78490394622bf3f05546adf1a379b4d7842 not found: ID does not exist" Apr 17 20:11:45.002429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:45.002375 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2f94f"] Apr 17 20:11:45.010359 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:45.010324 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2f94f"] Apr 17 20:11:46.267807 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:46.267769 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4624ab8-c2f5-48d8-98c0-89cb007e406c" path="/var/lib/kubelet/pods/d4624ab8-c2f5-48d8-98c0-89cb007e406c/volumes" Apr 17 20:11:51.740217 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:51.740177 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" Apr 17 20:11:51.740718 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:11:51.740232 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" Apr 17 20:11:51.762425 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:51.762381 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" Apr 17 20:11:52.009196 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:52.009115 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-xd8dj" Apr 17 20:11:52.704772 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:52.704726 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:52.704963 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:52.704849 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:52.709313 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:52.709286 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:52.994779 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:52.994680 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7447894dd7-6jt4x" Apr 17 20:11:53.041836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:53.041800 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d55486bdd-7z5gj"] Apr 17 20:11:56.024466 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.024432 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68"] Apr 17 20:11:56.024855 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.024841 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="d4624ab8-c2f5-48d8-98c0-89cb007e406c" containerName="registry-server" Apr 17 20:11:56.024897 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.024856 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4624ab8-c2f5-48d8-98c0-89cb007e406c" containerName="registry-server" Apr 17 20:11:56.024931 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.024920 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4624ab8-c2f5-48d8-98c0-89cb007e406c" containerName="registry-server" Apr 17 20:11:56.029724 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.029707 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.032806 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.032779 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-vg7cl\"" Apr 17 20:11:56.034774 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.034750 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68"] Apr 17 20:11:56.083759 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.083715 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9bwh\" (UniqueName: \"kubernetes.io/projected/7a058dc6-44e3-4561-9da9-b9904a3f943f-kube-api-access-d9bwh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.083941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.083794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.083941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.083828 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.184635 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.184598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.184817 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.184657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.184817 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.184768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9bwh\" (UniqueName: \"kubernetes.io/projected/7a058dc6-44e3-4561-9da9-b9904a3f943f-kube-api-access-d9bwh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: 
\"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.185013 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.184991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.185084 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.185065 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.192829 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.192789 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9bwh\" (UniqueName: \"kubernetes.io/projected/7a058dc6-44e3-4561-9da9-b9904a3f943f-kube-api-access-d9bwh\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.341475 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.341355 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" Apr 17 20:11:56.425563 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.425533 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82"] Apr 17 20:11:56.432848 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.432822 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.437204 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.437153 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82"] Apr 17 20:11:56.475380 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.475355 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68"] Apr 17 20:11:56.477276 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:56.477252 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a058dc6_44e3_4561_9da9_b9904a3f943f.slice/crio-0153bbba459d117b166e3830b04ba00d2b784bab7e0ba4d9f76f5d91f6cbb1b4 WatchSource:0}: Error finding container 0153bbba459d117b166e3830b04ba00d2b784bab7e0ba4d9f76f5d91f6cbb1b4: Status 404 returned error can't find the container with id 0153bbba459d117b166e3830b04ba00d2b784bab7e0ba4d9f76f5d91f6cbb1b4 Apr 17 20:11:56.487656 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.487632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.487766 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.487692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.487809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.487771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2tk\" (UniqueName: \"kubernetes.io/projected/d55850c8-6230-4987-a24c-d0e9fc331992-kube-api-access-6p2tk\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.588840 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.588790 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.588992 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.588900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.588992 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.588955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2tk\" (UniqueName: \"kubernetes.io/projected/d55850c8-6230-4987-a24c-d0e9fc331992-kube-api-access-6p2tk\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.589210 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.589132 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.589271 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.589238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 20:11:56.596592 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.596535 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2tk\" (UniqueName: \"kubernetes.io/projected/d55850c8-6230-4987-a24c-d0e9fc331992-kube-api-access-6p2tk\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" Apr 17 
20:11:56.746793 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.746740 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82"
Apr 17 20:11:56.824771 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.824738 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"]
Apr 17 20:11:56.830127 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.830102 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.837949 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.837918 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"]
Apr 17 20:11:56.880098 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.880072 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82"]
Apr 17 20:11:56.882039 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:56.882009 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd55850c8_6230_4987_a24c_d0e9fc331992.slice/crio-52ad659356e61393d51ee136527bfe00943d005764a802c7eb9118ba2649783f WatchSource:0}: Error finding container 52ad659356e61393d51ee136527bfe00943d005764a802c7eb9118ba2649783f: Status 404 returned error can't find the container with id 52ad659356e61393d51ee136527bfe00943d005764a802c7eb9118ba2649783f
Apr 17 20:11:56.891765 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.891737 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.891883 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.891793 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.891883 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.891862 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbng9\" (UniqueName: \"kubernetes.io/projected/202b2da4-5271-41cb-8547-07ddfedd6401-kube-api-access-rbng9\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.992863 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.992826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbng9\" (UniqueName: \"kubernetes.io/projected/202b2da4-5271-41cb-8547-07ddfedd6401-kube-api-access-rbng9\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.993044 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.992927 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.993044 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.992978 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.993350 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.993328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:56.993443 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:56.993364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:57.001240 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.001212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbng9\" (UniqueName: \"kubernetes.io/projected/202b2da4-5271-41cb-8547-07ddfedd6401-kube-api-access-rbng9\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:57.013611 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.013576 2568 generic.go:358] "Generic (PLEG): container finished" podID="d55850c8-6230-4987-a24c-d0e9fc331992" containerID="3970eb48acaf4af7c7377c37ca3bd2a5111f761d2519396524506fc27f89c21c" exitCode=0
Apr 17 20:11:57.013765 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.013648 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" event={"ID":"d55850c8-6230-4987-a24c-d0e9fc331992","Type":"ContainerDied","Data":"3970eb48acaf4af7c7377c37ca3bd2a5111f761d2519396524506fc27f89c21c"}
Apr 17 20:11:57.013765 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.013679 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" event={"ID":"d55850c8-6230-4987-a24c-d0e9fc331992","Type":"ContainerStarted","Data":"52ad659356e61393d51ee136527bfe00943d005764a802c7eb9118ba2649783f"}
Apr 17 20:11:57.018313 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.018285 2568 generic.go:358] "Generic (PLEG): container finished" podID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerID="86e094c43fdf8639791d5d3f474cb60b9fce72c422f4f8c314a7c94b8a95584b" exitCode=0
Apr 17 20:11:57.018481 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.018351 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" event={"ID":"7a058dc6-44e3-4561-9da9-b9904a3f943f","Type":"ContainerDied","Data":"86e094c43fdf8639791d5d3f474cb60b9fce72c422f4f8c314a7c94b8a95584b"}
Apr 17 20:11:57.018481 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.018381 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" event={"ID":"7a058dc6-44e3-4561-9da9-b9904a3f943f","Type":"ContainerStarted","Data":"0153bbba459d117b166e3830b04ba00d2b784bab7e0ba4d9f76f5d91f6cbb1b4"}
Apr 17 20:11:57.143558 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.143467 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:11:57.234274 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.234239 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"]
Apr 17 20:11:57.239519 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.239494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.245200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.245147 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"]
Apr 17 20:11:57.272255 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.272226 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"]
Apr 17 20:11:57.274353 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:57.274322 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202b2da4_5271_41cb_8547_07ddfedd6401.slice/crio-6d11f4e2e9fa22a16e4b849afe8c8a490bc184777219b0c21d7d06bae0703a22 WatchSource:0}: Error finding container 6d11f4e2e9fa22a16e4b849afe8c8a490bc184777219b0c21d7d06bae0703a22: Status 404 returned error can't find the container with id 6d11f4e2e9fa22a16e4b849afe8c8a490bc184777219b0c21d7d06bae0703a22
Apr 17 20:11:57.296356 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.296315 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.296494 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.296413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.296578 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.296544 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vq5\" (UniqueName: \"kubernetes.io/projected/8ef65446-9a36-45f7-8828-0e53b27e6918-kube-api-access-87vq5\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.397638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.397550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.397638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.397612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.397828 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.397652 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87vq5\" (UniqueName: \"kubernetes.io/projected/8ef65446-9a36-45f7-8828-0e53b27e6918-kube-api-access-87vq5\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.398073 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.398049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.398114 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.398048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.405997 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.405974 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vq5\" (UniqueName: \"kubernetes.io/projected/8ef65446-9a36-45f7-8828-0e53b27e6918-kube-api-access-87vq5\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.552741 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.552702 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:11:57.728901 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:57.728818 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"]
Apr 17 20:11:57.793905 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:11:57.793865 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef65446_9a36_45f7_8828_0e53b27e6918.slice/crio-c3968f73fc81a91e8bc88d4ae57d41680b04b4cd478ae54bed8f15800b4991de WatchSource:0}: Error finding container c3968f73fc81a91e8bc88d4ae57d41680b04b4cd478ae54bed8f15800b4991de: Status 404 returned error can't find the container with id c3968f73fc81a91e8bc88d4ae57d41680b04b4cd478ae54bed8f15800b4991de
Apr 17 20:11:58.024212 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.024175 2568 generic.go:358] "Generic (PLEG): container finished" podID="d55850c8-6230-4987-a24c-d0e9fc331992" containerID="8135c1253b6ee4888d3d5276faa23c1521a31456a756613742fc35012bbb0038" exitCode=0
Apr 17 20:11:58.024367 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.024215 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" event={"ID":"d55850c8-6230-4987-a24c-d0e9fc331992","Type":"ContainerDied","Data":"8135c1253b6ee4888d3d5276faa23c1521a31456a756613742fc35012bbb0038"}
Apr 17 20:11:58.025978 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.025884 2568 generic.go:358] "Generic (PLEG): container finished" podID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerID="ed8b4c0a2a34ab71e3ad9c4a4b89c163df50fad013ced1ebe596094c4e1e2496" exitCode=0
Apr 17 20:11:58.025978 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.025972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" event={"ID":"7a058dc6-44e3-4561-9da9-b9904a3f943f","Type":"ContainerDied","Data":"ed8b4c0a2a34ab71e3ad9c4a4b89c163df50fad013ced1ebe596094c4e1e2496"}
Apr 17 20:11:58.027898 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.027865 2568 generic.go:358] "Generic (PLEG): container finished" podID="202b2da4-5271-41cb-8547-07ddfedd6401" containerID="c9b486fd0bf59a70ffe756e0b6f02279ee550f4364c0a65f336337b92687e299" exitCode=0
Apr 17 20:11:58.031447 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.028263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v" event={"ID":"202b2da4-5271-41cb-8547-07ddfedd6401","Type":"ContainerDied","Data":"c9b486fd0bf59a70ffe756e0b6f02279ee550f4364c0a65f336337b92687e299"}
Apr 17 20:11:58.031447 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.028295 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v" event={"ID":"202b2da4-5271-41cb-8547-07ddfedd6401","Type":"ContainerStarted","Data":"6d11f4e2e9fa22a16e4b849afe8c8a490bc184777219b0c21d7d06bae0703a22"}
Apr 17 20:11:58.035486 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.035449 2568 generic.go:358] "Generic (PLEG): container finished" podID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerID="460218b478d1b50e07ae8c40ce02707c5c0e2aefd874e8009139f249d3772dfb" exitCode=0
Apr 17 20:11:58.035638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.035603 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs" event={"ID":"8ef65446-9a36-45f7-8828-0e53b27e6918","Type":"ContainerDied","Data":"460218b478d1b50e07ae8c40ce02707c5c0e2aefd874e8009139f249d3772dfb"}
Apr 17 20:11:58.035697 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:58.035632 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs" event={"ID":"8ef65446-9a36-45f7-8828-0e53b27e6918","Type":"ContainerStarted","Data":"c3968f73fc81a91e8bc88d4ae57d41680b04b4cd478ae54bed8f15800b4991de"}
Apr 17 20:11:59.041867 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.041765 2568 generic.go:358] "Generic (PLEG): container finished" podID="d55850c8-6230-4987-a24c-d0e9fc331992" containerID="9c2ea8758dae7d5a3f44c1e62e35807141359f8471281da49526b51280020306" exitCode=0
Apr 17 20:11:59.041867 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.041855 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" event={"ID":"d55850c8-6230-4987-a24c-d0e9fc331992","Type":"ContainerDied","Data":"9c2ea8758dae7d5a3f44c1e62e35807141359f8471281da49526b51280020306"}
Apr 17 20:11:59.043693 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.043671 2568 generic.go:358] "Generic (PLEG): container finished" podID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerID="3abf63b30a1005cf9ac801cf37ec5fa0d0c2d16ff7af5b2e5d21bb83c319410a" exitCode=0
Apr 17 20:11:59.043809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.043749 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" event={"ID":"7a058dc6-44e3-4561-9da9-b9904a3f943f","Type":"ContainerDied","Data":"3abf63b30a1005cf9ac801cf37ec5fa0d0c2d16ff7af5b2e5d21bb83c319410a"}
Apr 17 20:11:59.045342 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.045322 2568 generic.go:358] "Generic (PLEG): container finished" podID="202b2da4-5271-41cb-8547-07ddfedd6401" containerID="4be83b29d0315c7ad46f912d08d919910a480184b2e7cfb6a17fc1bd1c14b778" exitCode=0
Apr 17 20:11:59.045478 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.045415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v" event={"ID":"202b2da4-5271-41cb-8547-07ddfedd6401","Type":"ContainerDied","Data":"4be83b29d0315c7ad46f912d08d919910a480184b2e7cfb6a17fc1bd1c14b778"}
Apr 17 20:11:59.047023 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.047002 2568 generic.go:358] "Generic (PLEG): container finished" podID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerID="41987cab567ea7ae5f8725818bc72d6e68bd616f3851929205ac4370cb9a983c" exitCode=0
Apr 17 20:11:59.047121 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:11:59.047067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs" event={"ID":"8ef65446-9a36-45f7-8828-0e53b27e6918","Type":"ContainerDied","Data":"41987cab567ea7ae5f8725818bc72d6e68bd616f3851929205ac4370cb9a983c"}
Apr 17 20:12:00.052390 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.052352 2568 generic.go:358] "Generic (PLEG): container finished" podID="202b2da4-5271-41cb-8547-07ddfedd6401" containerID="dd21e7a54817d565eb2b1fa0a571628c46565b1a150d8738a408eeea74a67995" exitCode=0
Apr 17 20:12:00.052848 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.052425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v" event={"ID":"202b2da4-5271-41cb-8547-07ddfedd6401","Type":"ContainerDied","Data":"dd21e7a54817d565eb2b1fa0a571628c46565b1a150d8738a408eeea74a67995"}
Apr 17 20:12:00.054332 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.054309 2568 generic.go:358] "Generic (PLEG): container finished" podID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerID="e73b922ec1c1f744442311bad664f070d4a9d16755cc522bc60d467dfd918069" exitCode=0
Apr 17 20:12:00.054462 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.054409 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs" event={"ID":"8ef65446-9a36-45f7-8828-0e53b27e6918","Type":"ContainerDied","Data":"e73b922ec1c1f744442311bad664f070d4a9d16755cc522bc60d467dfd918069"}
Apr 17 20:12:00.195715 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.195689 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68"
Apr 17 20:12:00.217518 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.217488 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82"
Apr 17 20:12:00.326926 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.326826 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-bundle\") pod \"7a058dc6-44e3-4561-9da9-b9904a3f943f\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") "
Apr 17 20:12:00.326926 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.326871 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-bundle\") pod \"d55850c8-6230-4987-a24c-d0e9fc331992\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") "
Apr 17 20:12:00.326926 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.326891 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-util\") pod \"7a058dc6-44e3-4561-9da9-b9904a3f943f\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") "
Apr 17 20:12:00.327200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.326935 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-util\") pod \"d55850c8-6230-4987-a24c-d0e9fc331992\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") "
Apr 17 20:12:00.327200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.327021 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9bwh\" (UniqueName: \"kubernetes.io/projected/7a058dc6-44e3-4561-9da9-b9904a3f943f-kube-api-access-d9bwh\") pod \"7a058dc6-44e3-4561-9da9-b9904a3f943f\" (UID: \"7a058dc6-44e3-4561-9da9-b9904a3f943f\") "
Apr 17 20:12:00.327200 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.327039 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p2tk\" (UniqueName: \"kubernetes.io/projected/d55850c8-6230-4987-a24c-d0e9fc331992-kube-api-access-6p2tk\") pod \"d55850c8-6230-4987-a24c-d0e9fc331992\" (UID: \"d55850c8-6230-4987-a24c-d0e9fc331992\") "
Apr 17 20:12:00.327547 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.327518 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-bundle" (OuterVolumeSpecName: "bundle") pod "d55850c8-6230-4987-a24c-d0e9fc331992" (UID: "d55850c8-6230-4987-a24c-d0e9fc331992"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:00.327674 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.327543 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-bundle" (OuterVolumeSpecName: "bundle") pod "7a058dc6-44e3-4561-9da9-b9904a3f943f" (UID: "7a058dc6-44e3-4561-9da9-b9904a3f943f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:00.329758 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.329734 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a058dc6-44e3-4561-9da9-b9904a3f943f-kube-api-access-d9bwh" (OuterVolumeSpecName: "kube-api-access-d9bwh") pod "7a058dc6-44e3-4561-9da9-b9904a3f943f" (UID: "7a058dc6-44e3-4561-9da9-b9904a3f943f"). InnerVolumeSpecName "kube-api-access-d9bwh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:12:00.329850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.329765 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55850c8-6230-4987-a24c-d0e9fc331992-kube-api-access-6p2tk" (OuterVolumeSpecName: "kube-api-access-6p2tk") pod "d55850c8-6230-4987-a24c-d0e9fc331992" (UID: "d55850c8-6230-4987-a24c-d0e9fc331992"). InnerVolumeSpecName "kube-api-access-6p2tk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:12:00.332703 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.332675 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-util" (OuterVolumeSpecName: "util") pod "7a058dc6-44e3-4561-9da9-b9904a3f943f" (UID: "7a058dc6-44e3-4561-9da9-b9904a3f943f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:00.333365 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.333342 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-util" (OuterVolumeSpecName: "util") pod "d55850c8-6230-4987-a24c-d0e9fc331992" (UID: "d55850c8-6230-4987-a24c-d0e9fc331992"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:00.428618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.428565 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:00.428618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.428610 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d9bwh\" (UniqueName: \"kubernetes.io/projected/7a058dc6-44e3-4561-9da9-b9904a3f943f-kube-api-access-d9bwh\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:00.428618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.428622 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p2tk\" (UniqueName: \"kubernetes.io/projected/d55850c8-6230-4987-a24c-d0e9fc331992-kube-api-access-6p2tk\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:00.428618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.428632 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:00.428618 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.428641 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d55850c8-6230-4987-a24c-d0e9fc331992-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:00.428907 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:00.428649 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a058dc6-44e3-4561-9da9-b9904a3f943f-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:01.059738 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.059693 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82" event={"ID":"d55850c8-6230-4987-a24c-d0e9fc331992","Type":"ContainerDied","Data":"52ad659356e61393d51ee136527bfe00943d005764a802c7eb9118ba2649783f"}
Apr 17 20:12:01.059738 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.059737 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52ad659356e61393d51ee136527bfe00943d005764a802c7eb9118ba2649783f"
Apr 17 20:12:01.060215 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.059768 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82"
Apr 17 20:12:01.061604 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.061569 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68"
Apr 17 20:12:01.061751 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.061593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68" event={"ID":"7a058dc6-44e3-4561-9da9-b9904a3f943f","Type":"ContainerDied","Data":"0153bbba459d117b166e3830b04ba00d2b784bab7e0ba4d9f76f5d91f6cbb1b4"}
Apr 17 20:12:01.061751 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.061629 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0153bbba459d117b166e3830b04ba00d2b784bab7e0ba4d9f76f5d91f6cbb1b4"
Apr 17 20:12:01.199451 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.199427 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:12:01.223850 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.223823 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v"
Apr 17 20:12:01.337864 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.337775 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87vq5\" (UniqueName: \"kubernetes.io/projected/8ef65446-9a36-45f7-8828-0e53b27e6918-kube-api-access-87vq5\") pod \"8ef65446-9a36-45f7-8828-0e53b27e6918\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") "
Apr 17 20:12:01.337864 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.337835 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-util\") pod \"202b2da4-5271-41cb-8547-07ddfedd6401\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") "
Apr 17 20:12:01.338084 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.337896 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-util\") pod \"8ef65446-9a36-45f7-8828-0e53b27e6918\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") "
Apr 17 20:12:01.338084 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.337925 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-bundle\") pod \"202b2da4-5271-41cb-8547-07ddfedd6401\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") "
Apr 17 20:12:01.338084 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.337967 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-bundle\") pod \"8ef65446-9a36-45f7-8828-0e53b27e6918\" (UID: \"8ef65446-9a36-45f7-8828-0e53b27e6918\") "
Apr 17 20:12:01.338084 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.338023 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbng9\" (UniqueName: \"kubernetes.io/projected/202b2da4-5271-41cb-8547-07ddfedd6401-kube-api-access-rbng9\") pod \"202b2da4-5271-41cb-8547-07ddfedd6401\" (UID: \"202b2da4-5271-41cb-8547-07ddfedd6401\") "
Apr 17 20:12:01.338590 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.338529 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-bundle" (OuterVolumeSpecName: "bundle") pod "8ef65446-9a36-45f7-8828-0e53b27e6918" (UID: "8ef65446-9a36-45f7-8828-0e53b27e6918"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:01.338729 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.338700 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-bundle" (OuterVolumeSpecName: "bundle") pod "202b2da4-5271-41cb-8547-07ddfedd6401" (UID: "202b2da4-5271-41cb-8547-07ddfedd6401"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:01.340188 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.340157 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef65446-9a36-45f7-8828-0e53b27e6918-kube-api-access-87vq5" (OuterVolumeSpecName: "kube-api-access-87vq5") pod "8ef65446-9a36-45f7-8828-0e53b27e6918" (UID: "8ef65446-9a36-45f7-8828-0e53b27e6918"). InnerVolumeSpecName "kube-api-access-87vq5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:12:01.340273 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.340178 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202b2da4-5271-41cb-8547-07ddfedd6401-kube-api-access-rbng9" (OuterVolumeSpecName: "kube-api-access-rbng9") pod "202b2da4-5271-41cb-8547-07ddfedd6401" (UID: "202b2da4-5271-41cb-8547-07ddfedd6401"). InnerVolumeSpecName "kube-api-access-rbng9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:12:01.342730 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.342698 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-util" (OuterVolumeSpecName: "util") pod "202b2da4-5271-41cb-8547-07ddfedd6401" (UID: "202b2da4-5271-41cb-8547-07ddfedd6401"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:01.343565 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.343546 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-util" (OuterVolumeSpecName: "util") pod "8ef65446-9a36-45f7-8828-0e53b27e6918" (UID: "8ef65446-9a36-45f7-8828-0e53b27e6918"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:01.439836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.439784 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:01.439836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.439818 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbng9\" (UniqueName: \"kubernetes.io/projected/202b2da4-5271-41cb-8547-07ddfedd6401-kube-api-access-rbng9\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:01.439836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.439830 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87vq5\" (UniqueName: \"kubernetes.io/projected/8ef65446-9a36-45f7-8828-0e53b27e6918-kube-api-access-87vq5\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:01.439836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.439839 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:01.439836 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.439850 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ef65446-9a36-45f7-8828-0e53b27e6918-util\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:01.440140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:01.439858 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/202b2da4-5271-41cb-8547-07ddfedd6401-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:02.067165 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:02.067129 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs"
Apr 17 20:12:02.067678 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:02.067137 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs" event={"ID":"8ef65446-9a36-45f7-8828-0e53b27e6918","Type":"ContainerDied","Data":"c3968f73fc81a91e8bc88d4ae57d41680b04b4cd478ae54bed8f15800b4991de"}
Apr 17 20:12:02.067678 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:02.067263 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3968f73fc81a91e8bc88d4ae57d41680b04b4cd478ae54bed8f15800b4991de"
Apr 17 20:12:02.068924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:02.068901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v" event={"ID":"202b2da4-5271-41cb-8547-07ddfedd6401","Type":"ContainerDied","Data":"6d11f4e2e9fa22a16e4b849afe8c8a490bc184777219b0c21d7d06bae0703a22"}
Apr 17 20:12:02.069041 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:02.068927 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d11f4e2e9fa22a16e4b849afe8c8a490bc184777219b0c21d7d06bae0703a22"
Apr 17 20:12:02.069041 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:02.068954 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v" Apr 17 20:12:06.113443 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113389 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm"] Apr 17 20:12:06.113815 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113793 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="util" Apr 17 20:12:06.113815 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113805 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="util" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113820 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="extract" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113826 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="extract" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113834 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="util" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113839 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="util" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113846 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="pull" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113852 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="pull" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113859 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerName="extract" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113864 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerName="extract" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113875 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="pull" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113879 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="pull" Apr 17 20:12:06.113884 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113885 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerName="util" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113890 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerName="util" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113897 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113902 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113908 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" 
containerName="pull" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113913 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerName="pull" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113921 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113926 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113934 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="util" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113939 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="util" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113945 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="pull" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.113950 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="pull" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.114002 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="202b2da4-5271-41cb-8547-07ddfedd6401" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.114009 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ef65446-9a36-45f7-8828-0e53b27e6918" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: 
I0417 20:12:06.114018 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d55850c8-6230-4987-a24c-d0e9fc331992" containerName="extract" Apr 17 20:12:06.114198 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.114026 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a058dc6-44e3-4561-9da9-b9904a3f943f" containerName="extract" Apr 17 20:12:06.117364 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.117344 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" Apr 17 20:12:06.119893 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.119867 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-q6mdn\"" Apr 17 20:12:06.120459 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.120442 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 20:12:06.128327 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.128297 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm"] Apr 17 20:12:06.288450 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.288377 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb55z\" (UniqueName: \"kubernetes.io/projected/acb83e07-8182-4590-9e7b-a1f8ad9cded6-kube-api-access-rb55z\") pod \"dns-operator-controller-manager-648d5c98bc-2zchm\" (UID: \"acb83e07-8182-4590-9e7b-a1f8ad9cded6\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" Apr 17 20:12:06.389795 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.389694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb55z\" (UniqueName: 
\"kubernetes.io/projected/acb83e07-8182-4590-9e7b-a1f8ad9cded6-kube-api-access-rb55z\") pod \"dns-operator-controller-manager-648d5c98bc-2zchm\" (UID: \"acb83e07-8182-4590-9e7b-a1f8ad9cded6\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" Apr 17 20:12:06.400366 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.400332 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb55z\" (UniqueName: \"kubernetes.io/projected/acb83e07-8182-4590-9e7b-a1f8ad9cded6-kube-api-access-rb55z\") pod \"dns-operator-controller-manager-648d5c98bc-2zchm\" (UID: \"acb83e07-8182-4590-9e7b-a1f8ad9cded6\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" Apr 17 20:12:06.429462 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.429423 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" Apr 17 20:12:06.576017 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:06.575989 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm"] Apr 17 20:12:06.577684 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:12:06.577650 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacb83e07_8182_4590_9e7b_a1f8ad9cded6.slice/crio-0b0e2e751c4d714ffab9a54ab7eee2fa76b5178123525de994c240d3e21e4d62 WatchSource:0}: Error finding container 0b0e2e751c4d714ffab9a54ab7eee2fa76b5178123525de994c240d3e21e4d62: Status 404 returned error can't find the container with id 0b0e2e751c4d714ffab9a54ab7eee2fa76b5178123525de994c240d3e21e4d62 Apr 17 20:12:07.088941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:07.088904 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" 
event={"ID":"acb83e07-8182-4590-9e7b-a1f8ad9cded6","Type":"ContainerStarted","Data":"0b0e2e751c4d714ffab9a54ab7eee2fa76b5178123525de994c240d3e21e4d62"} Apr 17 20:12:10.105184 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.105148 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" event={"ID":"acb83e07-8182-4590-9e7b-a1f8ad9cded6","Type":"ContainerStarted","Data":"e4eddfa902df43a46bdcb3e5ae1be097e2f6966adb8105aad5183ffdf45f50d1"} Apr 17 20:12:10.105608 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.105203 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" Apr 17 20:12:10.122695 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.122638 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm" podStartSLOduration=1.510726629 podStartE2EDuration="4.122623201s" podCreationTimestamp="2026-04-17 20:12:06 +0000 UTC" firstStartedPulling="2026-04-17 20:12:06.579903526 +0000 UTC m=+474.920057856" lastFinishedPulling="2026-04-17 20:12:09.191800088 +0000 UTC m=+477.531954428" observedRunningTime="2026-04-17 20:12:10.120697337 +0000 UTC m=+478.460851710" watchObservedRunningTime="2026-04-17 20:12:10.122623201 +0000 UTC m=+478.462777552" Apr 17 20:12:10.470825 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.470788 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qp2qt"] Apr 17 20:12:10.474558 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.474533 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" Apr 17 20:12:10.476809 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.476791 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-gcrwg\"" Apr 17 20:12:10.485778 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.485751 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qp2qt"] Apr 17 20:12:10.633489 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.633450 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5fj\" (UniqueName: \"kubernetes.io/projected/a3e7d7d2-8c73-41e7-94dc-cd3665d81c04-kube-api-access-rx5fj\") pod \"authorino-operator-657f44b778-qp2qt\" (UID: \"a3e7d7d2-8c73-41e7-94dc-cd3665d81c04\") " pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" Apr 17 20:12:10.734146 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.734056 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5fj\" (UniqueName: \"kubernetes.io/projected/a3e7d7d2-8c73-41e7-94dc-cd3665d81c04-kube-api-access-rx5fj\") pod \"authorino-operator-657f44b778-qp2qt\" (UID: \"a3e7d7d2-8c73-41e7-94dc-cd3665d81c04\") " pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" Apr 17 20:12:10.742749 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.742716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5fj\" (UniqueName: \"kubernetes.io/projected/a3e7d7d2-8c73-41e7-94dc-cd3665d81c04-kube-api-access-rx5fj\") pod \"authorino-operator-657f44b778-qp2qt\" (UID: \"a3e7d7d2-8c73-41e7-94dc-cd3665d81c04\") " pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" Apr 17 20:12:10.790802 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.785739 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" Apr 17 20:12:10.935702 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:10.935673 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qp2qt"] Apr 17 20:12:10.937217 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:12:10.937185 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3e7d7d2_8c73_41e7_94dc_cd3665d81c04.slice/crio-41055ee6d8197ea9a88dbbc1f3a6c9b7c89977df83555129da859f282b8cf3a7 WatchSource:0}: Error finding container 41055ee6d8197ea9a88dbbc1f3a6c9b7c89977df83555129da859f282b8cf3a7: Status 404 returned error can't find the container with id 41055ee6d8197ea9a88dbbc1f3a6c9b7c89977df83555129da859f282b8cf3a7 Apr 17 20:12:11.110086 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:11.109987 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" event={"ID":"a3e7d7d2-8c73-41e7-94dc-cd3665d81c04","Type":"ContainerStarted","Data":"41055ee6d8197ea9a88dbbc1f3a6c9b7c89977df83555129da859f282b8cf3a7"} Apr 17 20:12:14.126667 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:14.126628 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" event={"ID":"a3e7d7d2-8c73-41e7-94dc-cd3665d81c04","Type":"ContainerStarted","Data":"149c03813e6d852c09868c11d86cf7aeb2379014616acfb8be6500570dc30601"} Apr 17 20:12:14.127123 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:14.126760 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" Apr 17 20:12:14.152466 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:14.152413 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt" podStartSLOduration=1.981827598 
podStartE2EDuration="4.152383334s" podCreationTimestamp="2026-04-17 20:12:10 +0000 UTC" firstStartedPulling="2026-04-17 20:12:10.939286834 +0000 UTC m=+479.279441163" lastFinishedPulling="2026-04-17 20:12:13.10984257 +0000 UTC m=+481.449996899" observedRunningTime="2026-04-17 20:12:14.150241214 +0000 UTC m=+482.490395567" watchObservedRunningTime="2026-04-17 20:12:14.152383334 +0000 UTC m=+482.492537685" Apr 17 20:12:18.063549 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.063474 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d55486bdd-7z5gj" podUID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" containerName="console" containerID="cri-o://c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677" gracePeriod=15 Apr 17 20:12:18.070455 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.070426 2568 patch_prober.go:28] interesting pod/console-7d55486bdd-7z5gj container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.24:8443/health\": dial tcp 10.133.0.24:8443: connect: connection refused" start-of-body= Apr 17 20:12:18.070571 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.070477 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-7d55486bdd-7z5gj" podUID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" containerName="console" probeResult="failure" output="Get \"https://10.133.0.24:8443/health\": dial tcp 10.133.0.24:8443: connect: connection refused" Apr 17 20:12:18.308750 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.308717 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d55486bdd-7z5gj_ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e/console/0.log" Apr 17 20:12:18.308907 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.308794 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d55486bdd-7z5gj" Apr 17 20:12:18.403204 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403113 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tn66\" (UniqueName: \"kubernetes.io/projected/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-kube-api-access-2tn66\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403204 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403159 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-trusted-ca-bundle\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403469 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403206 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-oauth-config\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403469 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403261 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-oauth-serving-cert\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403469 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403310 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-service-ca\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403469 
ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403450 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-serving-cert\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403683 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403506 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-config\") pod \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\" (UID: \"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e\") " Apr 17 20:12:18.403683 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403642 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:12:18.403793 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403686 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:12:18.403793 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403710 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-service-ca" (OuterVolumeSpecName: "service-ca") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:12:18.404017 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403994 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-trusted-ca-bundle\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:18.404102 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.404026 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-oauth-serving-cert\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:18.404102 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.404040 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-service-ca\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:18.404102 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.403995 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-config" (OuterVolumeSpecName: "console-config") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 20:12:18.405442 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.405422 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-kube-api-access-2tn66" (OuterVolumeSpecName: "kube-api-access-2tn66") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "kube-api-access-2tn66". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 20:12:18.405520 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.405475 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:12:18.405564 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.405519 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" (UID: "ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 20:12:18.505033 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.504992 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-serving-cert\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:18.505033 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.505028 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-config\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:18.505033 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.505039 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tn66\" (UniqueName: \"kubernetes.io/projected/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-kube-api-access-2tn66\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:18.505275 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:18.505049 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e-console-oauth-config\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\"" Apr 17 20:12:19.148621 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.148592 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d55486bdd-7z5gj_ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e/console/0.log" Apr 17 20:12:19.149008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.148633 2568 generic.go:358] "Generic (PLEG): container finished" podID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" containerID="c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677" exitCode=2 Apr 17 20:12:19.149008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.148702 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d55486bdd-7z5gj"
Apr 17 20:12:19.149008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.148728 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d55486bdd-7z5gj" event={"ID":"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e","Type":"ContainerDied","Data":"c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677"}
Apr 17 20:12:19.149008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.148769 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d55486bdd-7z5gj" event={"ID":"ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e","Type":"ContainerDied","Data":"4ab94cc22a924bd275e7ddd98046256fe3c7f9ea0904ec3fbc9c455e77094e84"}
Apr 17 20:12:19.149008 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.148788 2568 scope.go:117] "RemoveContainer" containerID="c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677"
Apr 17 20:12:19.158231 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.158211 2568 scope.go:117] "RemoveContainer" containerID="c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677"
Apr 17 20:12:19.158587 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:12:19.158558 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677\": container with ID starting with c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677 not found: ID does not exist" containerID="c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677"
Apr 17 20:12:19.158648 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.158599 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677"} err="failed to get container status \"c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677\": rpc error: code = NotFound desc = could not find container \"c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677\": container with ID starting with c1f789290b640b9ddd56efa60a94a97995a36f1f342fd853932a97b80c27e677 not found: ID does not exist"
Apr 17 20:12:19.173165 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.173133 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d55486bdd-7z5gj"]
Apr 17 20:12:19.182548 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:19.182515 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d55486bdd-7z5gj"]
Apr 17 20:12:20.268058 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:20.268014 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" path="/var/lib/kubelet/pods/ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e/volumes"
Apr 17 20:12:21.113362 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:21.113326 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2zchm"
Apr 17 20:12:25.133546 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:25.133513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-qp2qt"
Apr 17 20:12:27.151654 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.151612 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"]
Apr 17 20:12:27.152124 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.152059 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" containerName="console"
Apr 17 20:12:27.152124 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.152076 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" containerName="console"
Apr 17 20:12:27.152242 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.152192 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed2dcf27-72e1-455f-8c5a-3c3e6cb3767e" containerName="console"
Apr 17 20:12:27.155454 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.155434 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.157882 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.157854 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 17 20:12:27.158004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.157857 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 17 20:12:27.158004 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.157915 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-vg7cl\""
Apr 17 20:12:27.162177 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.162155 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"]
Apr 17 20:12:27.186765 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.186732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/013304f1-f25d-431d-9be5-38b6cd661b04-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.186942 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.186794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwr8r\" (UniqueName: \"kubernetes.io/projected/013304f1-f25d-431d-9be5-38b6cd661b04-kube-api-access-zwr8r\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.186942 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.186843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/013304f1-f25d-431d-9be5-38b6cd661b04-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.287672 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.287627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/013304f1-f25d-431d-9be5-38b6cd661b04-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.287876 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.287730 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwr8r\" (UniqueName: \"kubernetes.io/projected/013304f1-f25d-431d-9be5-38b6cd661b04-kube-api-access-zwr8r\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.287876 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:12:27.287769 2568 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 17 20:12:27.287876 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:12:27.287832 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/013304f1-f25d-431d-9be5-38b6cd661b04-plugin-serving-cert podName:013304f1-f25d-431d-9be5-38b6cd661b04 nodeName:}" failed. No retries permitted until 2026-04-17 20:12:27.787815101 +0000 UTC m=+496.127969429 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/013304f1-f25d-431d-9be5-38b6cd661b04-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-nrlc2" (UID: "013304f1-f25d-431d-9be5-38b6cd661b04") : secret "plugin-serving-cert" not found
Apr 17 20:12:27.287876 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.287773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/013304f1-f25d-431d-9be5-38b6cd661b04-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.288553 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.288531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/013304f1-f25d-431d-9be5-38b6cd661b04-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.298131 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.298099 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwr8r\" (UniqueName: \"kubernetes.io/projected/013304f1-f25d-431d-9be5-38b6cd661b04-kube-api-access-zwr8r\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.792388 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.792340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/013304f1-f25d-431d-9be5-38b6cd661b04-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:27.794819 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:27.794785 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/013304f1-f25d-431d-9be5-38b6cd661b04-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-nrlc2\" (UID: \"013304f1-f25d-431d-9be5-38b6cd661b04\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:28.066941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:28.066850 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"
Apr 17 20:12:28.194920 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:28.194890 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2"]
Apr 17 20:12:28.196543 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:12:28.196518 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod013304f1_f25d_431d_9be5_38b6cd661b04.slice/crio-1cadd1a75e19ad572f0d7d3e15769d1d5852914a31c090c27153308d8f506d88 WatchSource:0}: Error finding container 1cadd1a75e19ad572f0d7d3e15769d1d5852914a31c090c27153308d8f506d88: Status 404 returned error can't find the container with id 1cadd1a75e19ad572f0d7d3e15769d1d5852914a31c090c27153308d8f506d88
Apr 17 20:12:29.191443 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:29.191366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2" event={"ID":"013304f1-f25d-431d-9be5-38b6cd661b04","Type":"ContainerStarted","Data":"1cadd1a75e19ad572f0d7d3e15769d1d5852914a31c090c27153308d8f506d88"}
Apr 17 20:12:37.333081 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.333010 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"]
Apr 17 20:12:37.338527 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.338497 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.341541 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.341516 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rbv24\""
Apr 17 20:12:37.349657 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.349078 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"]
Apr 17 20:12:37.389353 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.389312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b25765c-d266-4cfe-92e2-fd271493b2ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.389557 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.389369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh6lq\" (UniqueName: \"kubernetes.io/projected/9b25765c-d266-4cfe-92e2-fd271493b2ee-kube-api-access-mh6lq\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.490267 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.490220 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b25765c-d266-4cfe-92e2-fd271493b2ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.490492 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.490288 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh6lq\" (UniqueName: \"kubernetes.io/projected/9b25765c-d266-4cfe-92e2-fd271493b2ee-kube-api-access-mh6lq\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.490753 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.490725 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b25765c-d266-4cfe-92e2-fd271493b2ee-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.501943 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.501895 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh6lq\" (UniqueName: \"kubernetes.io/projected/9b25765c-d266-4cfe-92e2-fd271493b2ee-kube-api-access-mh6lq\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:37.653798 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:37.653700 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:38.004876 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.004818 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"]
Apr 17 20:12:38.009118 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.009085 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"]
Apr 17 20:12:38.031726 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.031691 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"]
Apr 17 20:12:38.037939 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.037909 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.053776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.053720 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"]
Apr 17 20:12:38.097261 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.097226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.097481 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.097307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2c72\" (UniqueName: \"kubernetes.io/projected/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-kube-api-access-g2c72\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.197843 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.197804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2c72\" (UniqueName: \"kubernetes.io/projected/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-kube-api-access-g2c72\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.198021 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.197926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.198317 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.198296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.206604 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.206564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2c72\" (UniqueName: \"kubernetes.io/projected/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-kube-api-access-g2c72\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:38.355898 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:38.355777 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:51.742105 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:51.742077 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"]
Apr 17 20:12:51.746862 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:12:51.746829 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31d5ea4e_4b99_4b4a_806a_74fc9d61c4b0.slice/crio-2ef6dfd09e030506b2f6c2a02310bda8fbc6b3b9b2e8e41d9fac909a28ad8655 WatchSource:0}: Error finding container 2ef6dfd09e030506b2f6c2a02310bda8fbc6b3b9b2e8e41d9fac909a28ad8655: Status 404 returned error can't find the container with id 2ef6dfd09e030506b2f6c2a02310bda8fbc6b3b9b2e8e41d9fac909a28ad8655
Apr 17 20:12:52.298060 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:52.298020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2" event={"ID":"013304f1-f25d-431d-9be5-38b6cd661b04","Type":"ContainerStarted","Data":"f6a7324e6b65b3d4cb2d55b06d27d1acf4eac84d2235cc00a2e0fedcbd7db1f8"}
Apr 17 20:12:52.307430 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:52.307375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps" event={"ID":"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0","Type":"ContainerStarted","Data":"2ef6dfd09e030506b2f6c2a02310bda8fbc6b3b9b2e8e41d9fac909a28ad8655"}
Apr 17 20:12:52.317781 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:52.317722 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-nrlc2" podStartSLOduration=1.817891854 podStartE2EDuration="25.317702876s" podCreationTimestamp="2026-04-17 20:12:27 +0000 UTC" firstStartedPulling="2026-04-17 20:12:28.197845516 +0000 UTC m=+496.537999844" lastFinishedPulling="2026-04-17 20:12:51.697656536 +0000 UTC m=+520.037810866" observedRunningTime="2026-04-17 20:12:52.314832824 +0000 UTC m=+520.654987176" watchObservedRunningTime="2026-04-17 20:12:52.317702876 +0000 UTC m=+520.657857230"
Apr 17 20:12:54.139715 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:12:54.139678 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b25765c_d266_4cfe_92e2_fd271493b2ee.slice/crio-b0fd7fac5981e5f3e114b40e4f5586067b9e70d0e7d91f6ff1e115407b0058cd WatchSource:0}: Error finding container b0fd7fac5981e5f3e114b40e4f5586067b9e70d0e7d91f6ff1e115407b0058cd: Status 404 returned error can't find the container with id b0fd7fac5981e5f3e114b40e4f5586067b9e70d0e7d91f6ff1e115407b0058cd
Apr 17 20:12:57.333220 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.333178 2568 generic.go:358] "Generic (PLEG): container finished" podID="9b25765c-d266-4cfe-92e2-fd271493b2ee" containerID="d88f3de5b4b4a6fa02904a5bba9df9ba7f162a3b0eb971259af11ee87bcf3b5d" exitCode=1
Apr 17 20:12:57.335224 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.335188 2568 status_manager.go:895] "Failed to get status for pod" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" is forbidden: User \"system:node:ip-10-0-130-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-159.ec2.internal' and this object"
Apr 17 20:12:57.335349 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.335263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps" event={"ID":"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0","Type":"ContainerStarted","Data":"7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb"}
Apr 17 20:12:57.335578 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.335557 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:12:57.337859 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.337827 2568 status_manager.go:895] "Failed to get status for pod" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" is forbidden: User \"system:node:ip-10-0-130-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-159.ec2.internal' and this object"
Apr 17 20:12:57.359266 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.359218 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps" podStartSLOduration=14.533278565 podStartE2EDuration="19.359202285s" podCreationTimestamp="2026-04-17 20:12:38 +0000 UTC" firstStartedPulling="2026-04-17 20:12:51.749307246 +0000 UTC m=+520.089461576" lastFinishedPulling="2026-04-17 20:12:56.57523096 +0000 UTC m=+524.915385296" observedRunningTime="2026-04-17 20:12:57.357664076 +0000 UTC m=+525.697818427" watchObservedRunningTime="2026-04-17 20:12:57.359202285 +0000 UTC m=+525.699356636"
Apr 17 20:12:57.369666 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.369641 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:57.371707 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.371681 2568 status_manager.go:895] "Failed to get status for pod" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" is forbidden: User \"system:node:ip-10-0-130-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-159.ec2.internal' and this object"
Apr 17 20:12:57.493078 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.493044 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh6lq\" (UniqueName: \"kubernetes.io/projected/9b25765c-d266-4cfe-92e2-fd271493b2ee-kube-api-access-mh6lq\") pod \"9b25765c-d266-4cfe-92e2-fd271493b2ee\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") "
Apr 17 20:12:57.493270 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.493091 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b25765c-d266-4cfe-92e2-fd271493b2ee-extensions-socket-volume\") pod \"9b25765c-d266-4cfe-92e2-fd271493b2ee\" (UID: \"9b25765c-d266-4cfe-92e2-fd271493b2ee\") "
Apr 17 20:12:57.493415 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.493368 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b25765c-d266-4cfe-92e2-fd271493b2ee-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9b25765c-d266-4cfe-92e2-fd271493b2ee" (UID: "9b25765c-d266-4cfe-92e2-fd271493b2ee"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:12:57.495180 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.495150 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b25765c-d266-4cfe-92e2-fd271493b2ee-kube-api-access-mh6lq" (OuterVolumeSpecName: "kube-api-access-mh6lq") pod "9b25765c-d266-4cfe-92e2-fd271493b2ee" (UID: "9b25765c-d266-4cfe-92e2-fd271493b2ee"). InnerVolumeSpecName "kube-api-access-mh6lq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:12:57.593984 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.593896 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mh6lq\" (UniqueName: \"kubernetes.io/projected/9b25765c-d266-4cfe-92e2-fd271493b2ee-kube-api-access-mh6lq\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:57.593984 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:57.593933 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b25765c-d266-4cfe-92e2-fd271493b2ee-extensions-socket-volume\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:12:58.268378 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:58.268344 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" path="/var/lib/kubelet/pods/9b25765c-d266-4cfe-92e2-fd271493b2ee/volumes"
Apr 17 20:12:58.340632 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:58.340604 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck"
Apr 17 20:12:58.341091 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:58.340635 2568 scope.go:117] "RemoveContainer" containerID="d88f3de5b4b4a6fa02904a5bba9df9ba7f162a3b0eb971259af11ee87bcf3b5d"
Apr 17 20:12:58.342962 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:58.342925 2568 status_manager.go:895] "Failed to get status for pod" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" is forbidden: User \"system:node:ip-10-0-130-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-159.ec2.internal' and this object"
Apr 17 20:12:58.345002 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:12:58.344980 2568 status_manager.go:895] "Failed to get status for pod" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-pbjck" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-pbjck\" is forbidden: User \"system:node:ip-10-0-130-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-159.ec2.internal' and this object"
Apr 17 20:13:08.343089 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:08.343001 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:13:20.536291 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.536252 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"]
Apr 17 20:13:20.537081 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.536566 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps" podUID="31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" containerName="manager" containerID="cri-o://7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb" gracePeriod=10
Apr 17 20:13:20.803287 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.803261 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:13:20.912348 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.912309 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-extensions-socket-volume\") pod \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") "
Apr 17 20:13:20.912542 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.912418 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2c72\" (UniqueName: \"kubernetes.io/projected/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-kube-api-access-g2c72\") pod \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\" (UID: \"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0\") "
Apr 17 20:13:20.912748 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.912713 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" (UID: "31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:13:20.914776 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:20.914752 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-kube-api-access-g2c72" (OuterVolumeSpecName: "kube-api-access-g2c72") pod "31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" (UID: "31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0"). InnerVolumeSpecName "kube-api-access-g2c72". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:13:21.013110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.013068 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-extensions-socket-volume\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:13:21.013110 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.013102 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2c72\" (UniqueName: \"kubernetes.io/projected/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0-kube-api-access-g2c72\") on node \"ip-10-0-130-159.ec2.internal\" DevicePath \"\""
Apr 17 20:13:21.434009 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.433970 2568 generic.go:358] "Generic (PLEG): container finished" podID="31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" containerID="7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb" exitCode=0
Apr 17 20:13:21.434188 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.434020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps" event={"ID":"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0","Type":"ContainerDied","Data":"7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb"}
Apr 17 20:13:21.434188 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.434034 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"
Apr 17 20:13:21.434188 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.434047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps" event={"ID":"31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0","Type":"ContainerDied","Data":"2ef6dfd09e030506b2f6c2a02310bda8fbc6b3b9b2e8e41d9fac909a28ad8655"}
Apr 17 20:13:21.434188 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.434064 2568 scope.go:117] "RemoveContainer" containerID="7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb"
Apr 17 20:13:21.443896 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.443876 2568 scope.go:117] "RemoveContainer" containerID="7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb"
Apr 17 20:13:21.444139 ip-10-0-130-159 kubenswrapper[2568]: E0417 20:13:21.444123 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb\": container with ID starting with 7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb not found: ID does not exist" containerID="7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb"
Apr 17 20:13:21.444190 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.444147 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb"} err="failed to get container status \"7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb\": rpc error: code = NotFound desc = could not find container \"7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb\": container with ID starting with 7f6fa4d1f5a09c4d5d9b4c4c69e3847972272c629ab8aeac3f56de603e30b8cb not found: ID does not exist"
Apr 17 20:13:21.459581 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.459553 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"]
Apr 17 20:13:21.463172 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:21.463140 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-6p4ps"]
Apr 17 20:13:22.269050 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:22.269016 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" path="/var/lib/kubelet/pods/31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0/volumes"
Apr 17 20:13:24.565129 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565093 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57"]
Apr 17 20:13:24.565622 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565601 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" containerName="manager"
Apr 17 20:13:24.565698 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565625 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" containerName="manager"
Apr 17 20:13:24.565698 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565668 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" containerName="manager"
Apr 17 20:13:24.565698 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565676 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" containerName="manager"
Apr 17 20:13:24.565847 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565767 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="31d5ea4e-4b99-4b4a-806a-74fc9d61c4b0" containerName="manager"
Apr 17 20:13:24.565847 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.565783 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b25765c-d266-4cfe-92e2-fd271493b2ee" containerName="manager"
Apr 17 20:13:24.599212 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.590597 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57"]
Apr 17 20:13:24.599212 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.590762 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57"
Apr 17 20:13:24.599212 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.593604 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-9hxxj\""
Apr 17 20:13:24.649308 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57"
Apr 17 20:13:24.649516 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649324 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57"
Apr 17 20:13:24.649516 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649388 2568 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.649516 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.649516 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649472 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.649516 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.649516 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s8wts\" (UniqueName: \"kubernetes.io/projected/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-kube-api-access-s8wts\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.649718 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.649718 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.649604 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.750946 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.750907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751125 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.750956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751125 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.750993 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751125 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751017 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8wts\" (UniqueName: \"kubernetes.io/projected/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-kube-api-access-s8wts\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751125 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751125 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-workload-certs\") pod 
\"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751329 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751253 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751378 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751475 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751446 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751538 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751390 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751614 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751758 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751712 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751869 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751850 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.751929 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.751884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.753801 ip-10-0-130-159 kubenswrapper[2568]: I0417 
20:13:24.753778 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.753887 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.753870 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.759387 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.759367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.759564 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.759543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8wts\" (UniqueName: \"kubernetes.io/projected/9ed94358-a04c-4cde-88bb-d5f6e8fbaa47-kube-api-access-s8wts\") pod \"maas-default-gateway-openshift-default-845c6b4b48-dgc57\" (UID: \"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:24.910818 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:24.910714 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:25.043777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.043751 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57"] Apr 17 20:13:25.045713 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:13:25.045685 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed94358_a04c_4cde_88bb_d5f6e8fbaa47.slice/crio-50bcfe646fc47f99d81b56c02fc4aa093f2a007210e1318bec8e5cd3d9a1db0a WatchSource:0}: Error finding container 50bcfe646fc47f99d81b56c02fc4aa093f2a007210e1318bec8e5cd3d9a1db0a: Status 404 returned error can't find the container with id 50bcfe646fc47f99d81b56c02fc4aa093f2a007210e1318bec8e5cd3d9a1db0a Apr 17 20:13:25.048039 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.048000 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 20:13:25.048125 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.048084 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 20:13:25.048167 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.048131 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 20:13:25.454414 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.454355 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" 
event={"ID":"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47","Type":"ContainerStarted","Data":"e5574f5b7d4b75c792d4f20a011e316793e0557fcebd9a79aa63d23ff979ec23"} Apr 17 20:13:25.454414 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.454411 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" event={"ID":"9ed94358-a04c-4cde-88bb-d5f6e8fbaa47","Type":"ContainerStarted","Data":"50bcfe646fc47f99d81b56c02fc4aa093f2a007210e1318bec8e5cd3d9a1db0a"} Apr 17 20:13:25.478807 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.478756 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" podStartSLOduration=1.478741546 podStartE2EDuration="1.478741546s" podCreationTimestamp="2026-04-17 20:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:13:25.476138773 +0000 UTC m=+553.816293126" watchObservedRunningTime="2026-04-17 20:13:25.478741546 +0000 UTC m=+553.818895896" Apr 17 20:13:25.911615 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:25.911526 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:26.916448 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:26.916390 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:27.463273 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:27.463241 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:27.464429 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:27.464383 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-dgc57" Apr 17 20:13:28.721193 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.721158 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:13:28.724127 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.724108 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:28.726482 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.726456 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 20:13:28.731907 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.731881 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:13:28.761372 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.761337 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:13:28.792296 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.792253 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwpfg\" (UniqueName: \"kubernetes.io/projected/d6ae5458-b698-46da-95ff-32c0c3ff0e6f-kube-api-access-wwpfg\") pod \"limitador-limitador-78c99df468-qbbdp\" (UID: \"d6ae5458-b698-46da-95ff-32c0c3ff0e6f\") " pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:28.792296 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.792295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d6ae5458-b698-46da-95ff-32c0c3ff0e6f-config-file\") pod \"limitador-limitador-78c99df468-qbbdp\" (UID: \"d6ae5458-b698-46da-95ff-32c0c3ff0e6f\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:28.893467 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.893373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwpfg\" (UniqueName: \"kubernetes.io/projected/d6ae5458-b698-46da-95ff-32c0c3ff0e6f-kube-api-access-wwpfg\") pod \"limitador-limitador-78c99df468-qbbdp\" (UID: \"d6ae5458-b698-46da-95ff-32c0c3ff0e6f\") " pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:28.893685 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.893487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d6ae5458-b698-46da-95ff-32c0c3ff0e6f-config-file\") pod \"limitador-limitador-78c99df468-qbbdp\" (UID: \"d6ae5458-b698-46da-95ff-32c0c3ff0e6f\") " pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:28.894104 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.894081 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d6ae5458-b698-46da-95ff-32c0c3ff0e6f-config-file\") pod \"limitador-limitador-78c99df468-qbbdp\" (UID: \"d6ae5458-b698-46da-95ff-32c0c3ff0e6f\") " pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:28.902155 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:28.902129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwpfg\" (UniqueName: \"kubernetes.io/projected/d6ae5458-b698-46da-95ff-32c0c3ff0e6f-kube-api-access-wwpfg\") pod \"limitador-limitador-78c99df468-qbbdp\" (UID: \"d6ae5458-b698-46da-95ff-32c0c3ff0e6f\") " pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:29.037476 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:29.037360 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:29.166659 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:29.166630 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:13:29.169065 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:13:29.169030 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ae5458_b698_46da_95ff_32c0c3ff0e6f.slice/crio-7eea44221a79ef5b8357e0dba6bcd9e5878e836386456ffd68bb4b11808d2784 WatchSource:0}: Error finding container 7eea44221a79ef5b8357e0dba6bcd9e5878e836386456ffd68bb4b11808d2784: Status 404 returned error can't find the container with id 7eea44221a79ef5b8357e0dba6bcd9e5878e836386456ffd68bb4b11808d2784 Apr 17 20:13:29.472545 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:29.472511 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" event={"ID":"d6ae5458-b698-46da-95ff-32c0c3ff0e6f","Type":"ContainerStarted","Data":"7eea44221a79ef5b8357e0dba6bcd9e5878e836386456ffd68bb4b11808d2784"} Apr 17 20:13:32.488993 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:32.488957 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" event={"ID":"d6ae5458-b698-46da-95ff-32c0c3ff0e6f","Type":"ContainerStarted","Data":"51759a75b7c9e59998e2b06ee05e674907a7ac34121510a783f1a7dcbc30c836"} Apr 17 20:13:32.489520 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:32.489071 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:13:32.505355 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:32.505297 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" podStartSLOduration=1.999983357 
podStartE2EDuration="4.505279546s" podCreationTimestamp="2026-04-17 20:13:28 +0000 UTC" firstStartedPulling="2026-04-17 20:13:29.171036466 +0000 UTC m=+557.511190796" lastFinishedPulling="2026-04-17 20:13:31.676332654 +0000 UTC m=+560.016486985" observedRunningTime="2026-04-17 20:13:32.503530786 +0000 UTC m=+560.843685136" watchObservedRunningTime="2026-04-17 20:13:32.505279546 +0000 UTC m=+560.845433897" Apr 17 20:13:43.497940 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:13:43.497907 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-qbbdp" Apr 17 20:14:05.606547 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:14:05.606508 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:14:12.186096 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:14:12.186067 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:14:12.186739 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:14:12.186716 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:14:12.190222 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:14:12.190199 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log" Apr 17 20:14:12.190970 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:14:12.190953 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log" Apr 17 20:15:00.382689 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:15:00.382638 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:15:08.185951 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:15:08.185906 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:15:27.187651 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:15:27.187614 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:15:32.981228 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:15:32.981185 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:15:38.089461 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:15:38.089426 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qbbdp"] Apr 17 20:19:12.221360 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:12.221274 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:19:12.222054 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:12.222036 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:19:12.228848 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:12.228821 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log" Apr 17 20:19:12.229409 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:12.229377 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log" Apr 17 20:19:35.949410 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:35.949374 2568 log.go:25] "Finished parsing 
log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn_8064ee4e-f005-441a-8dcf-9dae03e7d1b1/manager/0.log" Apr 17 20:19:37.005209 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.005178 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs_8ef65446-9a36-45f7-8828-0e53b27e6918/util/0.log" Apr 17 20:19:37.010966 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.010940 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs_8ef65446-9a36-45f7-8828-0e53b27e6918/pull/0.log" Apr 17 20:19:37.016348 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.016328 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs_8ef65446-9a36-45f7-8828-0e53b27e6918/extract/0.log" Apr 17 20:19:37.123921 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.123893 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82_d55850c8-6230-4987-a24c-d0e9fc331992/util/0.log" Apr 17 20:19:37.129637 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.129610 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82_d55850c8-6230-4987-a24c-d0e9fc331992/pull/0.log" Apr 17 20:19:37.134953 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.134927 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82_d55850c8-6230-4987-a24c-d0e9fc331992/extract/0.log" Apr 17 20:19:37.237883 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.237857 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v_202b2da4-5271-41cb-8547-07ddfedd6401/util/0.log" Apr 17 20:19:37.244170 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.244148 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v_202b2da4-5271-41cb-8547-07ddfedd6401/pull/0.log" Apr 17 20:19:37.249924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.249903 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v_202b2da4-5271-41cb-8547-07ddfedd6401/extract/0.log" Apr 17 20:19:37.352887 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.352812 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68_7a058dc6-44e3-4561-9da9-b9904a3f943f/util/0.log" Apr 17 20:19:37.359150 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.359125 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68_7a058dc6-44e3-4561-9da9-b9904a3f943f/pull/0.log" Apr 17 20:19:37.365926 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.365898 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68_7a058dc6-44e3-4561-9da9-b9904a3f943f/extract/0.log" Apr 17 20:19:37.579893 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.579864 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qp2qt_a3e7d7d2-8c73-41e7-94dc-cd3665d81c04/manager/0.log" Apr 17 20:19:37.683663 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.683641 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-2zchm_acb83e07-8182-4590-9e7b-a1f8ad9cded6/manager/0.log" Apr 17 20:19:37.788749 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.788721 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-nrlc2_013304f1-f25d-431d-9be5-38b6cd661b04/kuadrant-console-plugin/0.log" Apr 17 20:19:37.896777 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:37.896746 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-xd8dj_ed2e018a-ad4f-479f-9a8c-b68f680c7ca5/registry-server/0.log" Apr 17 20:19:38.113638 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:38.113561 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-qbbdp_d6ae5458-b698-46da-95ff-32c0c3ff0e6f/limitador/0.log" Apr 17 20:19:38.557338 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:38.557309 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f5wwf2_17c4a279-6c87-455d-aa72-5a7d05af451b/istio-proxy/0.log" Apr 17 20:19:38.989341 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:38.989311 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-dgc57_9ed94358-a04c-4cde-88bb-d5f6e8fbaa47/istio-proxy/0.log" Apr 17 20:19:43.503116 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.503074 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrnl5/must-gather-5c82c"] Apr 17 20:19:43.507210 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.507185 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.509522 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.509500 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrnl5\"/\"kube-root-ca.crt\"" Apr 17 20:19:43.509645 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.509543 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrnl5\"/\"default-dockercfg-fxgz8\"" Apr 17 20:19:43.510444 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.510428 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrnl5\"/\"openshift-service-ca.crt\"" Apr 17 20:19:43.515143 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.515022 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/must-gather-5c82c"] Apr 17 20:19:43.669049 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.669001 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sczd5\" (UniqueName: \"kubernetes.io/projected/689af908-d09d-45db-b624-a3f5f369d6b8-kube-api-access-sczd5\") pod \"must-gather-5c82c\" (UID: \"689af908-d09d-45db-b624-a3f5f369d6b8\") " pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.669239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.669163 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689af908-d09d-45db-b624-a3f5f369d6b8-must-gather-output\") pod \"must-gather-5c82c\" (UID: \"689af908-d09d-45db-b624-a3f5f369d6b8\") " pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.770649 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.770538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/689af908-d09d-45db-b624-a3f5f369d6b8-must-gather-output\") pod \"must-gather-5c82c\" (UID: \"689af908-d09d-45db-b624-a3f5f369d6b8\") " pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.770649 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.770628 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sczd5\" (UniqueName: \"kubernetes.io/projected/689af908-d09d-45db-b624-a3f5f369d6b8-kube-api-access-sczd5\") pod \"must-gather-5c82c\" (UID: \"689af908-d09d-45db-b624-a3f5f369d6b8\") " pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.770903 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.770880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/689af908-d09d-45db-b624-a3f5f369d6b8-must-gather-output\") pod \"must-gather-5c82c\" (UID: \"689af908-d09d-45db-b624-a3f5f369d6b8\") " pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.778637 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.778609 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sczd5\" (UniqueName: \"kubernetes.io/projected/689af908-d09d-45db-b624-a3f5f369d6b8-kube-api-access-sczd5\") pod \"must-gather-5c82c\" (UID: \"689af908-d09d-45db-b624-a3f5f369d6b8\") " pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.818611 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.818568 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrnl5/must-gather-5c82c" Apr 17 20:19:43.951466 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.951433 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/must-gather-5c82c"] Apr 17 20:19:43.953178 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:19:43.953150 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod689af908_d09d_45db_b624_a3f5f369d6b8.slice/crio-0db60e4919ac4db82a9eae9c7d955b5d48f65d966783b4fa0385ebdcb4a2cb3b WatchSource:0}: Error finding container 0db60e4919ac4db82a9eae9c7d955b5d48f65d966783b4fa0385ebdcb4a2cb3b: Status 404 returned error can't find the container with id 0db60e4919ac4db82a9eae9c7d955b5d48f65d966783b4fa0385ebdcb4a2cb3b Apr 17 20:19:43.954941 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:43.954921 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:19:44.041705 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:44.041614 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/must-gather-5c82c" event={"ID":"689af908-d09d-45db-b624-a3f5f369d6b8","Type":"ContainerStarted","Data":"0db60e4919ac4db82a9eae9c7d955b5d48f65d966783b4fa0385ebdcb4a2cb3b"} Apr 17 20:19:45.050469 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:45.050435 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/must-gather-5c82c" event={"ID":"689af908-d09d-45db-b624-a3f5f369d6b8","Type":"ContainerStarted","Data":"62486bec0c24ed50272365083c431a77bb647989b58751a68a13dc1e8650e778"} Apr 17 20:19:46.076689 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:46.076234 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/must-gather-5c82c" 
event={"ID":"689af908-d09d-45db-b624-a3f5f369d6b8","Type":"ContainerStarted","Data":"fb4e677a5eafe4024dea6ca8eafa8217bf07308515db4d2f932c919042c8e99f"} Apr 17 20:19:46.092837 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:46.092774 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrnl5/must-gather-5c82c" podStartSLOduration=2.201416415 podStartE2EDuration="3.092753031s" podCreationTimestamp="2026-04-17 20:19:43 +0000 UTC" firstStartedPulling="2026-04-17 20:19:43.955082445 +0000 UTC m=+932.295236777" lastFinishedPulling="2026-04-17 20:19:44.846419064 +0000 UTC m=+933.186573393" observedRunningTime="2026-04-17 20:19:46.090287516 +0000 UTC m=+934.430441880" watchObservedRunningTime="2026-04-17 20:19:46.092753031 +0000 UTC m=+934.432907382" Apr 17 20:19:46.525969 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:46.525936 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hnjc9_63c6975a-1b4b-43cc-9f02-5ad1b5b8b8cc/global-pull-secret-syncer/0.log" Apr 17 20:19:46.620331 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:46.620296 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6bfbp_a591f534-0100-4238-b0cc-81835de74e25/konnectivity-agent/0.log" Apr 17 20:19:46.689717 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:46.689686 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-159.ec2.internal_6e89466f546d82f5fc8e46ec06064587/haproxy/0.log" Apr 17 20:19:51.049285 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.049255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs_8ef65446-9a36-45f7-8828-0e53b27e6918/extract/0.log" Apr 17 20:19:51.072707 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.072508 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs_8ef65446-9a36-45f7-8828-0e53b27e6918/util/0.log" Apr 17 20:19:51.104651 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.104608 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599rdfs_8ef65446-9a36-45f7-8828-0e53b27e6918/pull/0.log" Apr 17 20:19:51.138239 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.138204 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82_d55850c8-6230-4987-a24c-d0e9fc331992/extract/0.log" Apr 17 20:19:51.185289 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.185256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82_d55850c8-6230-4987-a24c-d0e9fc331992/util/0.log" Apr 17 20:19:51.208320 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.208291 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0wsk82_d55850c8-6230-4987-a24c-d0e9fc331992/pull/0.log" Apr 17 20:19:51.240665 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.240619 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v_202b2da4-5271-41cb-8547-07ddfedd6401/extract/0.log" Apr 17 20:19:51.263305 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.263252 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v_202b2da4-5271-41cb-8547-07ddfedd6401/util/0.log" Apr 17 20:19:51.285543 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.285509 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73z9n5v_202b2da4-5271-41cb-8547-07ddfedd6401/pull/0.log" Apr 17 20:19:51.310634 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.310550 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68_7a058dc6-44e3-4561-9da9-b9904a3f943f/extract/0.log" Apr 17 20:19:51.330973 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.330946 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68_7a058dc6-44e3-4561-9da9-b9904a3f943f/util/0.log" Apr 17 20:19:51.358355 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.358324 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef194b68_7a058dc6-44e3-4561-9da9-b9904a3f943f/pull/0.log" Apr 17 20:19:51.412795 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.412760 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qp2qt_a3e7d7d2-8c73-41e7-94dc-cd3665d81c04/manager/0.log" Apr 17 20:19:51.438972 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.438943 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-2zchm_acb83e07-8182-4590-9e7b-a1f8ad9cded6/manager/0.log" Apr 17 20:19:51.464044 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.464002 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-nrlc2_013304f1-f25d-431d-9be5-38b6cd661b04/kuadrant-console-plugin/0.log" Apr 17 20:19:51.494946 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.494908 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-xd8dj_ed2e018a-ad4f-479f-9a8c-b68f680c7ca5/registry-server/0.log" Apr 17 20:19:51.562728 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:51.562628 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-qbbdp_d6ae5458-b698-46da-95ff-32c0c3ff0e6f/limitador/0.log" Apr 17 20:19:53.041551 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.041517 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/alertmanager/0.log" Apr 17 20:19:53.064180 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.064142 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/config-reloader/0.log" Apr 17 20:19:53.086705 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.086674 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/kube-rbac-proxy-web/0.log" Apr 17 20:19:53.110563 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.110525 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/kube-rbac-proxy/0.log" Apr 17 20:19:53.133248 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.133216 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/kube-rbac-proxy-metric/0.log" Apr 17 20:19:53.154918 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.154885 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/prom-label-proxy/0.log" Apr 17 20:19:53.177715 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.177685 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_467b097d-25d9-4d5d-a793-923abf4bc77e/init-config-reloader/0.log" Apr 17 20:19:53.245488 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.245459 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2fz8b_d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7/kube-state-metrics/0.log" Apr 17 20:19:53.269921 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.269890 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2fz8b_d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7/kube-rbac-proxy-main/0.log" Apr 17 20:19:53.292146 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.292067 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2fz8b_d1a312e9-fdd6-4c8b-a92a-26bb4d05e5e7/kube-rbac-proxy-self/0.log" Apr 17 20:19:53.579431 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.579295 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w24f4_d8020047-5203-41aa-b91c-7a729e686edb/node-exporter/0.log" Apr 17 20:19:53.602234 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.602207 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w24f4_d8020047-5203-41aa-b91c-7a729e686edb/kube-rbac-proxy/0.log" Apr 17 20:19:53.623557 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.623528 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w24f4_d8020047-5203-41aa-b91c-7a729e686edb/init-textfile/0.log" Apr 17 20:19:53.651960 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.651837 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5pkvm_e756761d-4549-4f85-949c-f03278c10be7/kube-rbac-proxy-main/0.log" Apr 17 20:19:53.674758 ip-10-0-130-159 kubenswrapper[2568]: 
I0417 20:19:53.674720 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5pkvm_e756761d-4549-4f85-949c-f03278c10be7/kube-rbac-proxy-self/0.log" Apr 17 20:19:53.699675 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.699635 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-5pkvm_e756761d-4549-4f85-949c-f03278c10be7/openshift-state-metrics/0.log" Apr 17 20:19:53.976116 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.976083 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76b4cfb785-v8q64_2fdeaaec-d967-4813-a344-1271b1f604b6/telemeter-client/0.log" Apr 17 20:19:53.999272 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:53.999242 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76b4cfb785-v8q64_2fdeaaec-d967-4813-a344-1271b1f604b6/reload/0.log" Apr 17 20:19:54.021067 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.021036 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76b4cfb785-v8q64_2fdeaaec-d967-4813-a344-1271b1f604b6/kube-rbac-proxy/0.log" Apr 17 20:19:54.052668 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.052644 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66d48c7756-dzkf6_f7e56ce5-d56a-4759-ac37-5ab3784db08c/thanos-query/0.log" Apr 17 20:19:54.075612 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.075585 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66d48c7756-dzkf6_f7e56ce5-d56a-4759-ac37-5ab3784db08c/kube-rbac-proxy-web/0.log" Apr 17 20:19:54.102382 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.102278 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-66d48c7756-dzkf6_f7e56ce5-d56a-4759-ac37-5ab3784db08c/kube-rbac-proxy/0.log" Apr 17 20:19:54.126656 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.126624 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66d48c7756-dzkf6_f7e56ce5-d56a-4759-ac37-5ab3784db08c/prom-label-proxy/0.log" Apr 17 20:19:54.150275 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.150231 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66d48c7756-dzkf6_f7e56ce5-d56a-4759-ac37-5ab3784db08c/kube-rbac-proxy-rules/0.log" Apr 17 20:19:54.173710 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:54.173682 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-66d48c7756-dzkf6_f7e56ce5-d56a-4759-ac37-5ab3784db08c/kube-rbac-proxy-metrics/0.log" Apr 17 20:19:55.184782 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.184749 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fvm6g_863fd5e8-913c-4fd5-925c-77821846ba00/networking-console-plugin/0.log" Apr 17 20:19:55.263138 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.263082 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9"] Apr 17 20:19:55.270790 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.270754 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.276787 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.276759 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9"] Apr 17 20:19:55.394963 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.394924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-podres\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.395319 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.395278 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tf7d\" (UniqueName: \"kubernetes.io/projected/6e8faf0d-161e-4cc4-90c2-210cb64f628d-kube-api-access-2tf7d\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.395465 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.395378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-lib-modules\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.396010 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.395980 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-sys\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: 
\"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.396132 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.396048 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-proc\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497343 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tf7d\" (UniqueName: \"kubernetes.io/projected/6e8faf0d-161e-4cc4-90c2-210cb64f628d-kube-api-access-2tf7d\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497343 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-lib-modules\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497627 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-sys\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497627 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497508 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"proc\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-proc\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497627 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497583 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-podres\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497804 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497732 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-sys\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497804 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-podres\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497804 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-proc\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.497963 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.497873 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e8faf0d-161e-4cc4-90c2-210cb64f628d-lib-modules\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.507367 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.507321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf7d\" (UniqueName: \"kubernetes.io/projected/6e8faf0d-161e-4cc4-90c2-210cb64f628d-kube-api-access-2tf7d\") pod \"perf-node-gather-daemonset-4fgf9\" (UID: \"6e8faf0d-161e-4cc4-90c2-210cb64f628d\") " pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.590285 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.590245 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:55.772606 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.772577 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/1.log" Apr 17 20:19:55.773126 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.773097 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9"] Apr 17 20:19:55.775582 ip-10-0-130-159 kubenswrapper[2568]: W0417 20:19:55.775554 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6e8faf0d_161e_4cc4_90c2_210cb64f628d.slice/crio-b15df95f6eb6bb64a7179182dd54ce0319d2648be299f9d8927a14e2f55b3b17 WatchSource:0}: Error finding container b15df95f6eb6bb64a7179182dd54ce0319d2648be299f9d8927a14e2f55b3b17: Status 404 returned error can't find the container with id b15df95f6eb6bb64a7179182dd54ce0319d2648be299f9d8927a14e2f55b3b17 Apr 17 
20:19:55.779286 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:55.779262 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bg9xh_5c003df9-b811-4f77-9d0a-01312bf9421d/console-operator/2.log" Apr 17 20:19:56.134934 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:56.134844 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" event={"ID":"6e8faf0d-161e-4cc4-90c2-210cb64f628d","Type":"ContainerStarted","Data":"a5d28cc19022236cdbfe1f539c758fbea5f74196fc8e1f128565a9a220c8060f"} Apr 17 20:19:56.134934 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:56.134889 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" event={"ID":"6e8faf0d-161e-4cc4-90c2-210cb64f628d","Type":"ContainerStarted","Data":"b15df95f6eb6bb64a7179182dd54ce0319d2648be299f9d8927a14e2f55b3b17"} Apr 17 20:19:56.134934 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:56.134931 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" Apr 17 20:19:56.155068 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:56.154999 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9" podStartSLOduration=1.154978106 podStartE2EDuration="1.154978106s" podCreationTimestamp="2026-04-17 20:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:19:56.151449419 +0000 UTC m=+944.491603768" watchObservedRunningTime="2026-04-17 20:19:56.154978106 +0000 UTC m=+944.495132458" Apr 17 20:19:56.286931 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:56.286901 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-7447894dd7-6jt4x_8c2d5320-08a1-4719-ae72-c3c958a1b866/console/0.log"
Apr 17 20:19:57.550927 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:57.550898 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nrnx5_ea635e92-8024-48e9-9b19-6fbeddfe380a/dns/0.log"
Apr 17 20:19:57.571209 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:57.571178 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-nrnx5_ea635e92-8024-48e9-9b19-6fbeddfe380a/kube-rbac-proxy/0.log"
Apr 17 20:19:57.642449 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:57.642419 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-88qdj_316b73ef-655f-4979-a7b1-dcaf0e3bb3ad/dns-node-resolver/0.log"
Apr 17 20:19:58.140388 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:58.140340 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8555ccc55b-vmfxx_50f4a009-e41c-46c1-b6f2-bce51899dc5c/registry/0.log"
Apr 17 20:19:58.204635 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:58.204610 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gkjdr_7470e9e7-7248-44cf-81a8-fc62c99d05b9/node-ca/0.log"
Apr 17 20:19:58.980090 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:58.980062 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f5wwf2_17c4a279-6c87-455d-aa72-5a7d05af451b/istio-proxy/0.log"
Apr 17 20:19:59.115307 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:59.115269 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-dgc57_9ed94358-a04c-4cde-88bb-d5f6e8fbaa47/istio-proxy/0.log"
Apr 17 20:19:59.645652 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:19:59.645616 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qpnwl_37414adb-2a0d-4af9-93ad-64cc2ea178e7/serve-healthcheck-canary/0.log"
Apr 17 20:20:00.109627 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:00.109557 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-674s7_1980268a-97f5-4f06-8170-7ecf507eddf7/insights-operator/0.log"
Apr 17 20:20:00.110065 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:00.109674 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-674s7_1980268a-97f5-4f06-8170-7ecf507eddf7/insights-operator/1.log"
Apr 17 20:20:00.128886 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:00.128859 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bqprb_4436becd-7f00-417c-82b8-a06a4171ec21/kube-rbac-proxy/0.log"
Apr 17 20:20:00.149157 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:00.149131 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bqprb_4436becd-7f00-417c-82b8-a06a4171ec21/exporter/0.log"
Apr 17 20:20:00.169299 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:00.169274 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bqprb_4436becd-7f00-417c-82b8-a06a4171ec21/extractor/0.log"
Apr 17 20:20:02.151202 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:02.151171 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrnl5/perf-node-gather-daemonset-4fgf9"
Apr 17 20:20:02.226340 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:02.226303 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bcb6fdd5f-djjdn_8064ee4e-f005-441a-8dcf-9dae03e7d1b1/manager/0.log"
Apr 17 20:20:03.433536 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:03.433492 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fcb6f8ffb-qgsgg_ad2e17ec-813d-417b-a08f-a8a0ac70771d/manager/0.log"
Apr 17 20:20:09.257924 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.257891 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/kube-multus-additional-cni-plugins/0.log"
Apr 17 20:20:09.280648 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.280615 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/egress-router-binary-copy/0.log"
Apr 17 20:20:09.303900 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.303869 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/cni-plugins/0.log"
Apr 17 20:20:09.328383 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.328349 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/bond-cni-plugin/0.log"
Apr 17 20:20:09.354275 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.354245 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/routeoverride-cni/0.log"
Apr 17 20:20:09.376573 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.376548 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/whereabouts-cni-bincopy/0.log"
Apr 17 20:20:09.397506 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.397479 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-245q6_8beb86a8-efdf-4bba-8697-baf00c6854af/whereabouts-cni/0.log"
Apr 17 20:20:09.885048 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.885016 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rxx56_648545b8-d5e2-4491-9d4d-e78f3052aefb/kube-multus/0.log"
Apr 17 20:20:09.907725 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.907699 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2ctfd_a3207e4f-83f5-4913-a57e-c29dd6aed2df/network-metrics-daemon/0.log"
Apr 17 20:20:09.929140 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:09.929107 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2ctfd_a3207e4f-83f5-4913-a57e-c29dd6aed2df/kube-rbac-proxy/0.log"
Apr 17 20:20:10.873911 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.873875 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-controller/0.log"
Apr 17 20:20:10.891999 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.891968 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/0.log"
Apr 17 20:20:10.897116 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.897089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovn-acl-logging/1.log"
Apr 17 20:20:10.916713 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.916665 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/kube-rbac-proxy-node/0.log"
Apr 17 20:20:10.939265 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.939225 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 20:20:10.959627 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.959593 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/northd/0.log"
Apr 17 20:20:10.982227 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:10.982199 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/nbdb/0.log"
Apr 17 20:20:11.008202 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:11.008174 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/sbdb/0.log"
Apr 17 20:20:11.116159 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:11.116113 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jqvld_32f70982-1fda-48ca-bbf7-530ff3957212/ovnkube-controller/0.log"
Apr 17 20:20:12.735011 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:12.734983 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-q6svq_f96a9e59-b624-4850-ab85-b3968aa4f8b3/check-endpoints/0.log"
Apr 17 20:20:12.792981 ip-10-0-130-159 kubenswrapper[2568]: I0417 20:20:12.792946 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-x94f6_9cf5bd58-f267-46f8-9af8-24426ecf56e0/network-check-target-container/0.log"