Apr 20 19:08:13.645615 ip-10-0-139-126 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:08:14.098748 ip-10-0-139-126 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:08:14.098748 ip-10-0-139-126 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:08:14.098748 ip-10-0-139-126 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:08:14.098748 ip-10-0-139-126 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:08:14.098748 ip-10-0-139-126 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
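The deprecation warnings above all point at the kubelet config file passed via --config. As a rough sketch, the flagged options map onto a KubeletConfiguration like the one below. Field names follow the upstream kubelet.config.k8s.io/v1beta1 API; the values shown are illustrative placeholders, not values taken from this node.

```yaml
# Hypothetical KubeletConfiguration fragment covering the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (the socket below matches the FLAG dump later in this log)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (upstream default path, shown here as a placeholder)
volumePluginDir: /usr/libexec/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (placeholder reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# per the --minimum-container-ttl-duration notice, eviction thresholds are
# configured here instead (placeholder threshold)
evictionHard:
  memory.available: 100Mi
```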
Apr 20 19:08:14.101912 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.101819 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:08:14.105991 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.105975 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:14.105991 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.105993 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.105996 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106000 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106003 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106006 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106008 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106011 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106014 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106016 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106019 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106022 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106025 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106027 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106030 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106039 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106042 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106044 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106047 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106050 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106052 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:14.106057 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106055 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106058 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106062 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106065 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106067 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106070 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106073 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106076 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106078 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106081 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106083 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106086 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106088 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106090 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106093 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106095 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106098 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106101 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106103 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106105 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:14.106550 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106108 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106110 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106113 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106115 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106117 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106120 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106123 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106125 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106127 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106130 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106132 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106135 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106137 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106140 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106143 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106145 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106148 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106151 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106154 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106156 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:14.107032 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106158 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106161 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106163 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106166 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106169 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106172 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106176 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106178 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106181 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106183 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106186 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106189 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106192 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106194 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106196 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106199 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106202 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106206 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106210 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106213 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:14.107566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106216 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106219 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106222 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106225 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106229 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106634 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106640 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106644 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106647 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106649 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106652 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106655 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106657 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106660 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106663 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106665 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106668 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106671 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106673 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:14.108151 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106676 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106678 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106681 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106683 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106687 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106689 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106692 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106694 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106697 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106699 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106702 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106705 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106707 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106709 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106712 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106714 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106717 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106720 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106723 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106725 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:14.108640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106728 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106730 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106733 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106736 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106738 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106741 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106743 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106745 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106748 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106750 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106753 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106756 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106758 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106761 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106763 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106765 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106768 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106771 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106773 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106776 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:14.109140 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106778 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106780 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106783 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106785 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106789 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106793 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106795 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106797 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106800 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106803 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106805 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106808 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106810 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106812 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106815 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106817 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106820 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106822 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106825 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106828 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:14.109634 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106830 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106833 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106836 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106839 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106842 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106845 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106847 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106850 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106853 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106855 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106857 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.106860 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108427 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108438 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108444 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108448 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108453 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108456 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108461 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108466 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108469 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:08:14.110137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108472 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108475 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108479 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108482 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108486 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108488 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108491 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108494 2583 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108497 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108499 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108505 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108508 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108511 2583 flags.go:64] FLAG: --config-dir=""
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108514 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108517 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108521 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108524 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108527 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108531 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108534 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108537 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108540 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108544 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108547 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108551 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:08:14.110661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108554 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108557 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108560 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108563 2583 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108566 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108570 2583 flags.go:64] FLAG: --event-burst="100"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108573 2583 flags.go:64] FLAG: --event-qps="50"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108576 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108580 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108583 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108586 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108589 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108592 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108595 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108599 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108601 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108604 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108607 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108610 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108613 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108616 2583 flags.go:64] FLAG: --feature-gates=""
Apr 20 19:08:14.111261 ip-10-0-139-126
kubenswrapper[2583]: I0420 19:08:14.108620 2583 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108623 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108626 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108630 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108633 2583 flags.go:64] FLAG: --healthz-port="10248" Apr 20 19:08:14.111261 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108636 2583 flags.go:64] FLAG: --help="false" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108639 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-139-126.ec2.internal" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108642 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108645 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108648 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108652 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108655 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108659 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108662 2583 flags.go:64] FLAG: 
--image-service-endpoint="" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108664 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108667 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108670 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108673 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108676 2583 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108679 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108681 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108685 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108687 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108690 2583 flags.go:64] FLAG: --lock-file="" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108693 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108696 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108698 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108704 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:08:14.111896 ip-10-0-139-126 kubenswrapper[2583]: 
I0420 19:08:14.108707 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108710 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108713 2583 flags.go:64] FLAG: --logging-format="text" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108716 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108720 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108723 2583 flags.go:64] FLAG: --manifest-url="" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108726 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108735 2583 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108739 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108743 2583 flags.go:64] FLAG: --max-pods="110" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108746 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108749 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108752 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108755 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108758 2583 flags.go:64] FLAG: 
--minimum-image-ttl-duration="2m0s" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108760 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108763 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108771 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108774 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108777 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108780 2583 flags.go:64] FLAG: --pod-cidr="" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108782 2583 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108788 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108795 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:08:14.112499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108798 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108801 2583 flags.go:64] FLAG: --port="10250" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108804 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108807 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bc99d3924e1a934b" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:08:14.108810 2583 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108813 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108816 2583 flags.go:64] FLAG: --register-node="true" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108819 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108822 2583 flags.go:64] FLAG: --register-with-taints="" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108826 2583 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108829 2583 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108831 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108834 2583 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108838 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108841 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108844 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108848 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108851 2583 flags.go:64] FLAG: --runonce="false" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108854 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:08:14.108857 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108860 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108862 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108865 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108868 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108871 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108874 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:08:14.113100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108877 2583 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108879 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108883 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108885 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108888 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108893 2583 flags.go:64] FLAG: --system-cgroups="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108896 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108901 2583 flags.go:64] FLAG: 
--system-reserved-cgroup="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108904 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108907 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108912 2583 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108915 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108918 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108921 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108924 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108927 2583 flags.go:64] FLAG: --v="2" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108931 2583 flags.go:64] FLAG: --version="false" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108935 2583 flags.go:64] FLAG: --vmodule="" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108939 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.108943 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109060 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109064 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:08:14.113730 ip-10-0-139-126 
kubenswrapper[2583]: W0420 19:08:14.109068 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109071 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:08:14.113730 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109074 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109077 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109080 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109083 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109086 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109089 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109091 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109094 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109097 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109100 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109102 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 
19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109105 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109109 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109115 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109118 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109121 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109123 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109126 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109128 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109131 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:08:14.114297 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109133 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109136 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109138 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 
19:08:14.109143 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109146 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109150 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109154 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109157 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109159 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109163 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109165 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109168 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109170 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109173 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109175 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109178 2583 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109180 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109183 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109186 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:08:14.114802 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109188 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109190 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109193 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109195 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109198 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109200 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109204 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109207 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109209 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 
19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109212 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109214 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109219 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109222 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109225 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109227 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109230 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109234 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109236 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109239 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109242 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:08:14.115263 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109244 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109247 2583 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109249 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109252 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109254 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109257 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109259 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109262 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109265 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109267 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109269 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109272 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109274 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109277 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:08:14.115786 ip-10-0-139-126 
kubenswrapper[2583]: W0420 19:08:14.109279 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109282 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109284 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109286 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109290 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109292 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:14.115786 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109295 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:14.116299 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109297 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:14.116299 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.109300 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:14.116299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.109318 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:08:14.116833 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.116813 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 19:08:14.116867 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.116834 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116882 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116886 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116890 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116893 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116896 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116899 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:14.116899 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116902 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116906 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116923 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116928 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116931 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116934 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116936 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116940 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116943 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116946 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116949 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116952 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116955 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116959 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116963 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116966 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116968 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116971 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116974 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:14.117070 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116977 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116980 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116982 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116985 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116988 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116990 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116993 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116996 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.116999 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117002 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117005 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117008 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117010 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117013 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117015 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117018 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117021 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117024 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117026 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117029 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:14.117555 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117031 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117034 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117037 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117040 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117042 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117044 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117047 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117049 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117052 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117055 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117057 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117060 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117062 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117065 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117067 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117070 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117072 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117075 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117077 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117080 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:14.118031 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117083 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117086 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117090 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117093 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117095 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117098 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117101 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117103 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117106 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117109 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117111 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117114 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117116 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117119 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117122 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117124 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117127 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117129 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117132 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:14.118552 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117135 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117137 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.117142 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117244 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117249 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117252 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117255 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117258 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117261 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117265 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117268 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117271 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117274 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117277 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117280 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:14.119013 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117283 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117285 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117288 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117291 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117294 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117297 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117300 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117321 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117324 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117327 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117329 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117332 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117335 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117339 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117341 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117344 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117346 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117349 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117351 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117354 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:14.119413 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117356 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117359 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117361 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117364 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117367 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117369 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117371 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117374 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117376 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117379 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117382 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117385 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117387 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117390 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117392 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117395 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117397 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117400 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117402 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117404 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:14.119909 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117407 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117409 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117412 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117414 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117416 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117419 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117422 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117424 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117427 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117429 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117432 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117434 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117437 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117439 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117442 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117445 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117447 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117450 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117452 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:14.120423 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117454 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117457 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117459 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117462 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117464 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117467 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117469 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117472 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117474 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117476 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117479 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117481 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117483 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117486 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:14.117488 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.117493 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:08:14.120883 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.118298 2583 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:08:14.121272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.120449 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:08:14.121756 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.121745 2583 server.go:1019] "Starting client certificate rotation"
Apr 20 19:08:14.121856 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.121839 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:08:14.121891 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.121882 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:08:14.147184 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.147158 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:08:14.152507 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.152477 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:08:14.170701 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.170675 2583 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:08:14.177374 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.177348 2583 log.go:25] "Validated CRI v1 image API"
Apr 20 19:08:14.178608 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.178589 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:08:14.179810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.179783 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:08:14.184513 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.184485 2583 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 f31d3d13-6031-454e-b742-588bdbaa8689:/dev/nvme0n1p4 fcdfcedf-b9a5-480c-8d87-9cbf128eba0b:/dev/nvme0n1p3]
Apr 20 19:08:14.184611 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.184512 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:08:14.191356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.191210 2583 manager.go:217] Machine: {Timestamp:2026-04-20 19:08:14.188289478 +0000 UTC m=+0.420579978 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3016241 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28f6dd59d7c92c652da25adbbcc56b SystemUUID:ec28f6dd-59d7-c92c-652d-a25adbbcc56b BootID:bf4c63bb-b4e7-43d6-85d8-4768023797fa Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cf:1e:c3:1c:8f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cf:1e:c3:1c:8f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:de:ab:ff:7a:ec Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:08:14.191356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.191344 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:08:14.191530 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.191464 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:08:14.192591 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.192565 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:08:14.192766 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.192593 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-126.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 19:08:14.192853 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.192780 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 19:08:14.192853 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.192794 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 19:08:14.192853 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.192813 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:08:14.193409 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.193396 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:08:14.194603 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.194590 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:08:14.194911 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.194898 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 19:08:14.197392 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.197379 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 19:08:14.197466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.197399 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 19:08:14.197466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.197417 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 19:08:14.197466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.197432 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 20 19:08:14.197466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.197445 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 19:08:14.198501 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.198488 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:08:14.198569 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.198511 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:08:14.200492 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.200474 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b8kx9"
Apr 20 19:08:14.201513 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.201493 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 19:08:14.203356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.203341 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 19:08:14.205505 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205489 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205513 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205522 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205531 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205540 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205548 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205556 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205564 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 19:08:14.205572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205575 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 19:08:14.205794 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205585 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 19:08:14.205794 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205596 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 19:08:14.205794 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.205609 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 19:08:14.206076 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.206040 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-126.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 19:08:14.206229 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.206211 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 19:08:14.207495 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.207483 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 19:08:14.207545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.207497 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 19:08:14.210018 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.210003 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-126.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 19:08:14.211367 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.211355 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 19:08:14.211420 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.211396 2583 server.go:1295] "Started kubelet"
Apr 20 19:08:14.211545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.211494 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 19:08:14.211585 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.211515 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 19:08:14.212623 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.212598 2583 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 19:08:14.213210 ip-10-0-139-126 systemd[1]: Started Kubernetes Kubelet.
Apr 20 19:08:14.214849 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.214240 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 19:08:14.214849 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.214288 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b8kx9"
Apr 20 19:08:14.216765 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.216750 2583 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 19:08:14.222929 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.222909 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 19:08:14.224405 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.224385 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 19:08:14.224405 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.224398 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 19:08:14.225116 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225096 2583 factory.go:55] Registering systemd factory
Apr 20 19:08:14.225116 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225112 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 19:08:14.225256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225120 2583 factory.go:223] Registration of the systemd container factory successfully
Apr 20 19:08:14.225256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225112 2583 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 19:08:14.225256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225162 2583 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 19:08:14.225256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225248 2583 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 19:08:14.225256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225257 2583 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 19:08:14.225513 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.225489 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.225586 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225573 2583 factory.go:153] Registering CRI-O factory
Apr 20 19:08:14.225635 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225588 2583 factory.go:223] Registration of the crio container factory successfully
Apr 20 19:08:14.225635 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225634 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 19:08:14.225733 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225654 2583 factory.go:103] Registering Raw factory
Apr 20 19:08:14.225733 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225665 2583 manager.go:1196] Started watching for new ooms in manager
Apr 20 19:08:14.225991 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.225980 2583 manager.go:319] Starting recovery of all containers
Apr 20 19:08:14.233277 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.233101 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:08:14.235605 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.235582 2583 manager.go:324] Recovery completed
Apr 20 19:08:14.236433 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.236410 2583 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-126.ec2.internal\" not found" node="ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.239571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.239556 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:08:14.242101 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.242084 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:08:14.242162 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.242117 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:08:14.242162 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.242127 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:08:14.242666 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.242650 2583 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 19:08:14.242666 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.242664 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 19:08:14.242769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.242681 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:08:14.245742 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.245728 2583 policy_none.go:49] "None policy: Start"
Apr 20 19:08:14.245824 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.245746 2583 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 19:08:14.245824 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.245757 2583 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 19:08:14.281926 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.281906 2583 manager.go:341] "Starting Device Plugin manager"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.281965 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.281981 2583 server.go:85] "Starting device plugin registration server"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.282269 2583 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.282280 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.282378 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.282460 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.282470 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.283045 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 19:08:14.294122 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.283086 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.354884 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.354794 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 19:08:14.356339 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.356319 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 19:08:14.356438 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.356347 2583 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 19:08:14.356438 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.356367 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 19:08:14.356438 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.356375 2583 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 19:08:14.356438 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.356413 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 19:08:14.360686 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.360666 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:08:14.382981 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.382962 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:08:14.384181 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.384166 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:08:14.384245 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.384196 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:08:14.384245 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.384206 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:08:14.384245 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.384231 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.393817 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.393800 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.393887 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.393825 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-126.ec2.internal\": node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.407674 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.407651 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.456822 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.456773 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"]
Apr 20 19:08:14.456935 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.456873 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:08:14.457932 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.457915 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:08:14.458035 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.457952 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:08:14.458035 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.457967 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:08:14.459410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.459386 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:08:14.459566 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.459550 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.459642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.459588 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:08:14.460123 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.460107 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:08:14.460186 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.460138 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:08:14.460186 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.460152 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:08:14.460244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.460114 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:08:14.460244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.460208 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:08:14.460244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.460219 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:08:14.461278 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.461265 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.461381 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.461287 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:08:14.461988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.461969 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:08:14.462078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.461995 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:08:14.462078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.462010 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:08:14.490353 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.490324 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-126.ec2.internal\" not found" node="ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.494749 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.494729 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-126.ec2.internal\" not found" node="ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.508759 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.508731 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.609686 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.609604 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.626360 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.626335 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ff307b2820fbf179d500677369dc91d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal\" (UID: \"9ff307b2820fbf179d500677369dc91d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.626464 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.626365 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ff307b2820fbf179d500677369dc91d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal\" (UID: \"9ff307b2820fbf179d500677369dc91d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.626464 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.626388 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec9ad27b137077b1d073e8b7bc68876b-config\") pod \"kube-apiserver-proxy-ip-10-0-139-126.ec2.internal\" (UID: \"ec9ad27b137077b1d073e8b7bc68876b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.710749 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.710716 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.727078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.727046 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ff307b2820fbf179d500677369dc91d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal\" (UID: \"9ff307b2820fbf179d500677369dc91d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.727078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.727081 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ff307b2820fbf179d500677369dc91d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal\" (UID: \"9ff307b2820fbf179d500677369dc91d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.727224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.727100 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec9ad27b137077b1d073e8b7bc68876b-config\") pod \"kube-apiserver-proxy-ip-10-0-139-126.ec2.internal\" (UID: \"ec9ad27b137077b1d073e8b7bc68876b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.727224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.727143 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ff307b2820fbf179d500677369dc91d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal\" (UID: \"9ff307b2820fbf179d500677369dc91d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.727224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.727155 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ff307b2820fbf179d500677369dc91d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal\" (UID: \"9ff307b2820fbf179d500677369dc91d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.727224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.727215 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec9ad27b137077b1d073e8b7bc68876b-config\") pod \"kube-apiserver-proxy-ip-10-0-139-126.ec2.internal\" (UID: \"ec9ad27b137077b1d073e8b7bc68876b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.792242 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.792198 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.798068 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:14.798049 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:14.811593 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.811568 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:14.912248 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:14.912159 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.012663 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:15.012627 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.113174 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:15.113145 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.121317 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.121296 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 19:08:15.121467 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.121452 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 19:08:15.121506 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.121473 2583 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 19:08:15.213684 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:15.213624 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.216933 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.216913 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:08:15.218128 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.218098 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:03:14 +0000 UTC" deadline="2027-12-06 19:26:47.428886905 +0000 UTC"
Apr 20 19:08:15.218128 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.218126 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14280h18m32.210763994s"
Apr 20 19:08:15.225157 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.225137 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 19:08:15.237512 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.237495 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:08:15.266348 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.266298 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2d7bd"
Apr 20 19:08:15.275164 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.275141 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2d7bd"
Apr 20 19:08:15.314711 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:15.314679 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.330953 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:15.330923 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff307b2820fbf179d500677369dc91d.slice/crio-426496e6daecd562317ea8e5bd3e08e94045f3ea1ab88e74d4257c45f1b4770b WatchSource:0}: Error finding container 426496e6daecd562317ea8e5bd3e08e94045f3ea1ab88e74d4257c45f1b4770b: Status 404 returned error can't find the container with id 426496e6daecd562317ea8e5bd3e08e94045f3ea1ab88e74d4257c45f1b4770b
Apr 20 19:08:15.331362 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:15.331336 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9ad27b137077b1d073e8b7bc68876b.slice/crio-d99db1c26f3f53fdc84cfabfdb7e3a2910c1ba656bdb9f633f0459f8bbcdeb3d WatchSource:0}: Error finding container d99db1c26f3f53fdc84cfabfdb7e3a2910c1ba656bdb9f633f0459f8bbcdeb3d: Status 404 returned error can't find the container with id d99db1c26f3f53fdc84cfabfdb7e3a2910c1ba656bdb9f633f0459f8bbcdeb3d
Apr 20 19:08:15.336172 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.336153 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:08:15.359071 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.359025 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal" event={"ID":"9ff307b2820fbf179d500677369dc91d","Type":"ContainerStarted","Data":"426496e6daecd562317ea8e5bd3e08e94045f3ea1ab88e74d4257c45f1b4770b"}
Apr 20 19:08:15.359949 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.359924 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal" event={"ID":"ec9ad27b137077b1d073e8b7bc68876b","Type":"ContainerStarted","Data":"d99db1c26f3f53fdc84cfabfdb7e3a2910c1ba656bdb9f633f0459f8bbcdeb3d"}
Apr 20 19:08:15.415123 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:15.415099 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.515699 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:15.515609 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-126.ec2.internal\" not found"
Apr 20 19:08:15.597023 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.596996 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:08:15.625167 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.625139 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:15.637167 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.637141 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:08:15.638054 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.638038 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal"
Apr 20 19:08:15.653400 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.653378
2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:08:15.928702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:15.928631 2583 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:16.198736 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.198649 2583 apiserver.go:52] "Watching apiserver" Apr 20 19:08:16.218010 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.217902 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 19:08:16.219840 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.219811 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-kv9j6","openshift-dns/node-resolver-qp25w","openshift-image-registry/node-ca-nd8bd","openshift-network-diagnostics/network-check-target-xb2cz","openshift-ovn-kubernetes/ovnkube-node-46gfp","kube-system/konnectivity-agent-wnxqf","kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal","openshift-multus/multus-additional-cni-plugins-m9kdg","openshift-multus/multus-pm94g","openshift-multus/network-metrics-daemon-9gbcz","openshift-network-operator/iptables-alerter-jbf6q"] Apr 20 19:08:16.222263 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.222239 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.224822 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qp25w" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.224859 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.225214 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jzjqs\"" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.225230 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.225277 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.225286 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:08:16.225577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.225450 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 19:08:16.226103 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.226081 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nd8bd" Apr 20 19:08:16.226229 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.226177 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:16.226292 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.226245 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:16.227399 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.227379 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.227495 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.227422 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.227589 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.227574 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-45mgx\"" Apr 20 19:08:16.227928 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.227908 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.228498 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.228478 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.229263 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.228850 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:08:16.229263 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.229045 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.229263 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.229048 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wnxqf" Apr 20 19:08:16.229263 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.229094 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4thbx\"" Apr 20 19:08:16.230184 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.230166 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.230832 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.230812 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.230925 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.230897 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 19:08:16.231269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.231251 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 19:08:16.231577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.231561 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.233004 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.232985 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.234206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234047 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 19:08:16.234320 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234284 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:16.234410 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.234360 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:16.234451 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234425 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8adeca3e-66a6-48a4-81e8-13898bdffa54-host\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd" Apr 20 19:08:16.234485 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234455 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-cnibin\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234520 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234481 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-os-release\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234549 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234655 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234585 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234655 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234613 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/406ba3b2-33e1-4cc9-941c-55a06a114e38-hosts-file\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w" Apr 20 19:08:16.234655 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234649 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2ln\" (UniqueName: \"kubernetes.io/projected/406ba3b2-33e1-4cc9-941c-55a06a114e38-kube-api-access-9c2ln\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w" Apr 20 19:08:16.234811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234695 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8adeca3e-66a6-48a4-81e8-13898bdffa54-serviceca\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd" Apr 20 19:08:16.234811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234726 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:16.234811 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234770 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234813 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngr4\" (UniqueName: \"kubernetes.io/projected/0eb04f4c-6eed-4f33-ae86-5126283315de-kube-api-access-cngr4\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234843 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/406ba3b2-33e1-4cc9-941c-55a06a114e38-tmp-dir\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234867 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xtj5\" (UniqueName: \"kubernetes.io/projected/8adeca3e-66a6-48a4-81e8-13898bdffa54-kube-api-access-2xtj5\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234873 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 
19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234875 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234929 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234941 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-system-cni-dir\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.234951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234946 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.235336 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.234984 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-cni-binary-copy\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.235434 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.235415 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 19:08:16.235485 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.235454 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 19:08:16.235537 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.235417 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6lv2k\"" Apr 20 19:08:16.235657 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.235638 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.236120 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236081 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.236212 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236135 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-44k7p\"" Apr 20 19:08:16.236212 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236165 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j2hm6\"" Apr 20 19:08:16.236350 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236277 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.236440 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236424 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 19:08:16.236525 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236511 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 19:08:16.236604 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.236588 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wgvsw\"" Apr 20 19:08:16.236757 ip-10-0-139-126 
kubenswrapper[2583]: I0420 19:08:16.236742 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-crld4\"" Apr 20 19:08:16.237914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.237894 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 19:08:16.238100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.238082 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 19:08:16.238192 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.238176 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:08:16.239791 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.239771 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rwvq8\"" Apr 20 19:08:16.276035 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.276004 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:03:15 +0000 UTC" deadline="2027-10-07 10:54:57.492338743 +0000 UTC" Apr 20 19:08:16.276035 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.276034 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12831h46m41.216308137s" Apr 20 19:08:16.301434 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.301405 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:16.326037 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.326013 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 
19:08:16.335210 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335183 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-kubernetes\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.335357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335231 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8adeca3e-66a6-48a4-81e8-13898bdffa54-host\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd" Apr 20 19:08:16.335357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335259 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-cnibin\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.335357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335323 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-var-lib-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335366 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8adeca3e-66a6-48a4-81e8-13898bdffa54-host\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd" 
Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335366 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-node-log\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335328 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-cnibin\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335405 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-modprobe-d\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335436 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-cnibin\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335458 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-os-release\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " 
pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335481 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-cni-netd\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.335511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335506 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335534 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-host\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335572 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-cni-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335635 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-cni-binary-copy\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335679 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-hostroot\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335714 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-multus-certs\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335749 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/846d14a0-81fa-4701-9219-6c5631b28c34-agent-certs\") pod \"konnectivity-agent-wnxqf\" (UID: \"846d14a0-81fa-4701-9219-6c5631b28c34\") " pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335774 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1c7034ae-80c0-4dd1-9098-c92022ad516a-iptables-alerter-script\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335797 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-systemd-units\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335836 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.335871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335867 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-run\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335893 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-var-lib-kubelet\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335918 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-sys\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335945 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335961 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2ln\" (UniqueName: \"kubernetes.io/projected/406ba3b2-33e1-4cc9-941c-55a06a114e38-kube-api-access-9c2ln\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335976 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1c7034ae-80c0-4dd1-9098-c92022ad516a-host-slash\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.335990 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-sys-fs\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336036 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-netns\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336069 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68z7q\" (UniqueName: \"kubernetes.io/projected/8cbf6223-eb9e-4c30-9be7-c289ced45992-kube-api-access-68z7q\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336098 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-log-socket\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336122 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-device-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336146 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysctl-conf\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336168 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-lib-modules\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336213 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-tuned\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336272 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-k8s-cni-cncf-io\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336321 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-kubelet\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336346 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-conf-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336373 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-etc-kubernetes\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336413 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn9f6\" (UniqueName: \"kubernetes.io/projected/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-kube-api-access-pn9f6\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336450 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-os-release\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336476 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-run-netns\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336501 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovnkube-config\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336524 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-tmp\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336533 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-os-release\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336548 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-cni-multus\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336581 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336586 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8adeca3e-66a6-48a4-81e8-13898bdffa54-serviceca\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336627 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cngr4\" (UniqueName: \"kubernetes.io/projected/0eb04f4c-6eed-4f33-ae86-5126283315de-kube-api-access-cngr4\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336715 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/406ba3b2-33e1-4cc9-941c-55a06a114e38-tmp-dir\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w"
Apr 20 19:08:16.336878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336827 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr48g\" (UniqueName: \"kubernetes.io/projected/1c7034ae-80c0-4dd1-9098-c92022ad516a-kube-api-access-jr48g\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336909 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-etc-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336932 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8adeca3e-66a6-48a4-81e8-13898bdffa54-serviceca\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336949 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovnkube-script-lib\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.336980 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-registration-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337003 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-socket-dir-parent\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337023 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xtj5\" (UniqueName: \"kubernetes.io/projected/8adeca3e-66a6-48a4-81e8-13898bdffa54-kube-api-access-2xtj5\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337078 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/846d14a0-81fa-4701-9219-6c5631b28c34-konnectivity-ca\") pod \"konnectivity-agent-wnxqf\" (UID: \"846d14a0-81fa-4701-9219-6c5631b28c34\") " pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337092 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/406ba3b2-33e1-4cc9-941c-55a06a114e38-tmp-dir\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337125 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-cni-bin\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337167 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-cni-binary-copy\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337228 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-kubelet\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337296 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-slash\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337341 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-cni-bin\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337374 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-socket-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337398 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysconfig\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337433 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.337532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337489 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/406ba3b2-33e1-4cc9-941c-55a06a114e38-hosts-file\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337527 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337544 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337546 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-env-overrides\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337592 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-systemd\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337600 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/406ba3b2-33e1-4cc9-941c-55a06a114e38-hosts-file\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337600 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-cni-binary-copy\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337617 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh74q\" (UniqueName: \"kubernetes.io/projected/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-kube-api-access-vh74q\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337644 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-system-cni-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337670 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337696 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337726 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-525lt\" (UniqueName: \"kubernetes.io/projected/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-kube-api-access-525lt\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337751 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-systemd\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337773 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-ovn\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337788 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-run-ovn-kubernetes\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337805 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.338137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337863 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gllrj\" (UniqueName: \"kubernetes.io/projected/141e70e5-9790-4051-898b-ed3910ef3927-kube-api-access-gllrj\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337883 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysctl-d\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337905 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-system-cni-dir\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337950 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337985 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0eb04f4c-6eed-4f33-ae86-5126283315de-system-cni-dir\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.337984 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovn-node-metrics-cert\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.338021 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-daemon-config\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.338709 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.338089 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0eb04f4c-6eed-4f33-ae86-5126283315de-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.345832 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.345805 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 19:08:16.346224 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.346198 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:08:16.346336 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.346227 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:08:16.346336 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.346242 2583 projected.go:194] Error preparing data for projected volume kube-api-access-frsq9 for pod openshift-network-diagnostics/network-check-target-xb2cz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:16.346496 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.346391 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9 podName:d79586bb-910e-443a-be9f-96dd10cd1d31 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:16.846359496 +0000 UTC m=+3.078650002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-frsq9" (UniqueName: "kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9") pod "network-check-target-xb2cz" (UID: "d79586bb-910e-443a-be9f-96dd10cd1d31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:16.349955 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.349929 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngr4\" (UniqueName: \"kubernetes.io/projected/0eb04f4c-6eed-4f33-ae86-5126283315de-kube-api-access-cngr4\") pod \"multus-additional-cni-plugins-m9kdg\" (UID: \"0eb04f4c-6eed-4f33-ae86-5126283315de\") " pod="openshift-multus/multus-additional-cni-plugins-m9kdg"
Apr 20 19:08:16.350048 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.349929 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xtj5\" (UniqueName: \"kubernetes.io/projected/8adeca3e-66a6-48a4-81e8-13898bdffa54-kube-api-access-2xtj5\") pod \"node-ca-nd8bd\" (UID: \"8adeca3e-66a6-48a4-81e8-13898bdffa54\") " pod="openshift-image-registry/node-ca-nd8bd"
Apr 20 19:08:16.350271 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.350254 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2ln\" (UniqueName: \"kubernetes.io/projected/406ba3b2-33e1-4cc9-941c-55a06a114e38-kube-api-access-9c2ln\") pod \"node-resolver-qp25w\" (UID: \"406ba3b2-33e1-4cc9-941c-55a06a114e38\") " pod="openshift-dns/node-resolver-qp25w"
Apr 20 19:08:16.438640 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438605 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/846d14a0-81fa-4701-9219-6c5631b28c34-konnectivity-ca\") pod \"konnectivity-agent-wnxqf\" (UID: \"846d14a0-81fa-4701-9219-6c5631b28c34\") " pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:16.438814 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438657 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-cni-bin\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.438814 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438682 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-kubelet\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.438814 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438705 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-slash\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.438814 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438730 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-cni-bin\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.438814 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438783 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-cni-bin\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.438814 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438788 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-kubelet\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438810 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-cni-bin\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g"
Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438825 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-socket-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79"
Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438794 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-slash\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438874 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysconfig\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438894 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438912 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-env-overrides\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438929 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysconfig\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438932 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-systemd\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438947 2583 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-socket-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438974 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.438976 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vh74q\" (UniqueName: \"kubernetes.io/projected/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-kube-api-access-vh74q\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439020 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-system-cni-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-systemd\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439075 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-525lt\" (UniqueName: \"kubernetes.io/projected/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-kube-api-access-525lt\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:16.439095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439099 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/846d14a0-81fa-4701-9219-6c5631b28c34-konnectivity-ca\") pod \"konnectivity-agent-wnxqf\" (UID: \"846d14a0-81fa-4701-9219-6c5631b28c34\") " pod="kube-system/konnectivity-agent-wnxqf" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439102 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-systemd\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439127 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-system-cni-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439149 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-ovn\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439169 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-systemd\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439176 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-run-ovn-kubernetes\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439200 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439210 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-ovn\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gllrj\" (UniqueName: \"kubernetes.io/projected/141e70e5-9790-4051-898b-ed3910ef3927-kube-api-access-gllrj\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.439545 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439244 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-run-ovn-kubernetes\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439249 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysctl-d\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439277 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439300 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovn-node-metrics-cert\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439329 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-env-overrides\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439343 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-daemon-config\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439342 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-etc-selinux\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439371 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-kubernetes\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439391 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-var-lib-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.439545 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.439403 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439437 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-node-log\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439444 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysctl-d\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439406 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-node-log\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439469 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-kubernetes\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.439482 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:16.939455786 +0000 UTC m=+3.171746292 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439511 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-var-lib-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439563 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-modprobe-d\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439590 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-cnibin\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439613 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-os-release\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439644 2583 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-cni-netd\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439668 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439678 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-cnibin\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439691 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-host\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439727 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-cni-netd\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439727 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-modprobe-d\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439735 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-host\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.440473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-cni-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439771 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439780 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-cni-binary-copy\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439807 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-cni-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439808 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-hostroot\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439822 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-os-release\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439834 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-hostroot\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439850 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-multus-certs\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439868 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-daemon-config\") pod \"multus-pm94g\" (UID: 
\"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439877 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/846d14a0-81fa-4701-9219-6c5631b28c34-agent-certs\") pod \"konnectivity-agent-wnxqf\" (UID: \"846d14a0-81fa-4701-9219-6c5631b28c34\") " pod="kube-system/konnectivity-agent-wnxqf" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1c7034ae-80c0-4dd1-9098-c92022ad516a-iptables-alerter-script\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439914 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-multus-certs\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439925 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-systemd-units\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.439959 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-systemd-units\") pod \"ovnkube-node-46gfp\" (UID: 
\"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440025 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440049 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-run\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440072 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-var-lib-kubelet\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440096 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-sys\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.441238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440124 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1c7034ae-80c0-4dd1-9098-c92022ad516a-host-slash\") pod \"iptables-alerter-jbf6q\" (UID: 
\"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-sys-fs\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440167 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-netns\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440191 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-cni-binary-copy\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440191 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68z7q\" (UniqueName: \"kubernetes.io/projected/8cbf6223-eb9e-4c30-9be7-c289ced45992-kube-api-access-68z7q\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440234 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-log-socket\") pod \"ovnkube-node-46gfp\" (UID: 
\"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440246 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-run-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440265 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-device-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440270 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-sys\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440290 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysctl-conf\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440338 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-lib-modules\") pod \"tuned-kv9j6\" (UID: 
\"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440355 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-sys-fs\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440363 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-tuned\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440378 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-var-lib-kubelet\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440388 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-k8s-cni-cncf-io\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-netns\") pod \"multus-pm94g\" (UID: 
\"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440292 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1c7034ae-80c0-4dd1-9098-c92022ad516a-host-slash\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440432 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-run-k8s-cni-cncf-io\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442047 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440340 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-run\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440454 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1c7034ae-80c0-4dd1-9098-c92022ad516a-iptables-alerter-script\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440478 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-log-socket\") pod \"ovnkube-node-46gfp\" (UID: 
\"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440516 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-lib-modules\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440564 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-device-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440599 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-kubelet\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440617 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-conf-dir\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440633 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-etc-kubernetes\") pod \"multus-pm94g\" (UID: 
\"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440659 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn9f6\" (UniqueName: \"kubernetes.io/projected/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-kube-api-access-pn9f6\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440674 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-run-netns\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440679 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-sysctl-conf\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440690 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovnkube-config\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440691 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-conf-dir\") pod \"multus-pm94g\" (UID: 
\"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440710 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-tmp\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440726 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-cni-multus\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440731 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-etc-kubernetes\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440734 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-kubelet\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440744 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr48g\" (UniqueName: \"kubernetes.io/projected/1c7034ae-80c0-4dd1-9098-c92022ad516a-kube-api-access-jr48g\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " 
pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.442596 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440845 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-host-run-netns\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.440894 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-host-var-lib-cni-multus\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441061 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-etc-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441112 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovnkube-script-lib\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441138 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-registration-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: 
\"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441161 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-socket-dir-parent\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441234 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8cbf6223-eb9e-4c30-9be7-c289ced45992-etc-openvswitch\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441241 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/141e70e5-9790-4051-898b-ed3910ef3927-registration-dir\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441275 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovnkube-config\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.441283 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-multus-socket-dir-parent\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.442248 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovn-node-metrics-cert\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.442609 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/846d14a0-81fa-4701-9219-6c5631b28c34-agent-certs\") pod \"konnectivity-agent-wnxqf\" (UID: \"846d14a0-81fa-4701-9219-6c5631b28c34\") " pod="kube-system/konnectivity-agent-wnxqf" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.442762 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8cbf6223-eb9e-4c30-9be7-c289ced45992-ovnkube-script-lib\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.442808 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-etc-tuned\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.443160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.443029 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-tmp\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.449803 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.449745 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh74q\" (UniqueName: \"kubernetes.io/projected/dcdcfdc4-6f4b-4b64-a365-14e07ea949cf-kube-api-access-vh74q\") pod \"tuned-kv9j6\" (UID: \"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf\") " pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.449938 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.449883 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr48g\" (UniqueName: \"kubernetes.io/projected/1c7034ae-80c0-4dd1-9098-c92022ad516a-kube-api-access-jr48g\") pod \"iptables-alerter-jbf6q\" (UID: \"1c7034ae-80c0-4dd1-9098-c92022ad516a\") " pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.450726 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.450705 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68z7q\" (UniqueName: \"kubernetes.io/projected/8cbf6223-eb9e-4c30-9be7-c289ced45992-kube-api-access-68z7q\") pod \"ovnkube-node-46gfp\" (UID: \"8cbf6223-eb9e-4c30-9be7-c289ced45992\") " pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.450726 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.450713 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-525lt\" (UniqueName: \"kubernetes.io/projected/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-kube-api-access-525lt\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:16.450999 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.450979 2583 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-gllrj\" (UniqueName: \"kubernetes.io/projected/141e70e5-9790-4051-898b-ed3910ef3927-kube-api-access-gllrj\") pod \"aws-ebs-csi-driver-node-2xt79\" (UID: \"141e70e5-9790-4051-898b-ed3910ef3927\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.451679 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.451659 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn9f6\" (UniqueName: \"kubernetes.io/projected/978e6f95-f2d1-47d8-a94e-f79ad5c672d6-kube-api-access-pn9f6\") pod \"multus-pm94g\" (UID: \"978e6f95-f2d1-47d8-a94e-f79ad5c672d6\") " pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.536836 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.536802 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" Apr 20 19:08:16.544720 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.544694 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qp25w" Apr 20 19:08:16.553349 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.553325 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nd8bd" Apr 20 19:08:16.558057 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.558035 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:08:16.563650 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.563627 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wnxqf" Apr 20 19:08:16.570202 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.570177 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" Apr 20 19:08:16.576848 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.576828 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" Apr 20 19:08:16.583609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.583589 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pm94g" Apr 20 19:08:16.591170 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.591151 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jbf6q" Apr 20 19:08:16.924958 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.924932 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod846d14a0_81fa_4701_9219_6c5631b28c34.slice/crio-dac72c310ad59f07b5f38caf224b66625c0454863b3f76a8788b81639a591a6f WatchSource:0}: Error finding container dac72c310ad59f07b5f38caf224b66625c0454863b3f76a8788b81639a591a6f: Status 404 returned error can't find the container with id dac72c310ad59f07b5f38caf224b66625c0454863b3f76a8788b81639a591a6f Apr 20 19:08:16.926503 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.926473 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cbf6223_eb9e_4c30_9be7_c289ced45992.slice/crio-5482a35fd7306bb4844c88ad095b35b87a3c3ff42df7d44e2150e6712c932e3e WatchSource:0}: Error finding container 5482a35fd7306bb4844c88ad095b35b87a3c3ff42df7d44e2150e6712c932e3e: Status 404 returned error can't find the container with id 5482a35fd7306bb4844c88ad095b35b87a3c3ff42df7d44e2150e6712c932e3e Apr 20 19:08:16.928658 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.927984 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcdcfdc4_6f4b_4b64_a365_14e07ea949cf.slice/crio-7d4458af9fc14ccc2c193641b5c0f1554173f5bb26a31c5de99e4d48c9da17ee WatchSource:0}: Error finding container 7d4458af9fc14ccc2c193641b5c0f1554173f5bb26a31c5de99e4d48c9da17ee: Status 404 returned error can't find the container with id 7d4458af9fc14ccc2c193641b5c0f1554173f5bb26a31c5de99e4d48c9da17ee Apr 20 19:08:16.931004 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.930981 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod978e6f95_f2d1_47d8_a94e_f79ad5c672d6.slice/crio-9da66f4b235a99f5ba7780fa3e27841cac4485b35af32ad85a0f9bd11de45fd8 WatchSource:0}: Error finding container 9da66f4b235a99f5ba7780fa3e27841cac4485b35af32ad85a0f9bd11de45fd8: Status 404 returned error can't find the container with id 9da66f4b235a99f5ba7780fa3e27841cac4485b35af32ad85a0f9bd11de45fd8 Apr 20 19:08:16.932597 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.932562 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb04f4c_6eed_4f33_ae86_5126283315de.slice/crio-2ed8fb45ea3593a9292cf6ad755b58c3c44dd65cb6a7e34d54a839da07d4bbf6 WatchSource:0}: Error finding container 2ed8fb45ea3593a9292cf6ad755b58c3c44dd65cb6a7e34d54a839da07d4bbf6: Status 404 returned error can't find the container with id 2ed8fb45ea3593a9292cf6ad755b58c3c44dd65cb6a7e34d54a839da07d4bbf6 Apr 20 19:08:16.938429 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.938405 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406ba3b2_33e1_4cc9_941c_55a06a114e38.slice/crio-5c3e23be40705e62446c58d12ca1922d05d09863d0a58e44a674dc01df8526ab WatchSource:0}: Error finding container 5c3e23be40705e62446c58d12ca1922d05d09863d0a58e44a674dc01df8526ab: Status 404 returned error can't find 
the container with id 5c3e23be40705e62446c58d12ca1922d05d09863d0a58e44a674dc01df8526ab Apr 20 19:08:16.939460 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.939436 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c7034ae_80c0_4dd1_9098_c92022ad516a.slice/crio-9abe4118ac52cad647f460a33ab1aa936fc060b38a706c3e8f8fe5ec20572786 WatchSource:0}: Error finding container 9abe4118ac52cad647f460a33ab1aa936fc060b38a706c3e8f8fe5ec20572786: Status 404 returned error can't find the container with id 9abe4118ac52cad647f460a33ab1aa936fc060b38a706c3e8f8fe5ec20572786 Apr 20 19:08:16.940571 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:16.940549 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141e70e5_9790_4051_898b_ed3910ef3927.slice/crio-4ad6463b8b4fea8dcdd19efba367dad5179c9b9009dc40b3c3fee33c7a40379e WatchSource:0}: Error finding container 4ad6463b8b4fea8dcdd19efba367dad5179c9b9009dc40b3c3fee33c7a40379e: Status 404 returned error can't find the container with id 4ad6463b8b4fea8dcdd19efba367dad5179c9b9009dc40b3c3fee33c7a40379e Apr 20 19:08:16.944029 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.943996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:16.944135 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:16.944042 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " 
pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:16.944266 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.944164 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:16.944266 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.944212 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:17.944193901 +0000 UTC m=+4.176484386 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:16.944376 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.944349 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:16.944376 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.944366 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:16.944454 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.944378 2583 projected.go:194] Error preparing data for projected volume kube-api-access-frsq9 for pod openshift-network-diagnostics/network-check-target-xb2cz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:16.944454 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:16.944417 2583 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9 podName:d79586bb-910e-443a-be9f-96dd10cd1d31 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:17.944404103 +0000 UTC m=+4.176694590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-frsq9" (UniqueName: "kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9") pod "network-check-target-xb2cz" (UID: "d79586bb-910e-443a-be9f-96dd10cd1d31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:17.277533 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.277201 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:03:15 +0000 UTC" deadline="2027-10-21 19:16:01.894426274 +0000 UTC" Apr 20 19:08:17.277533 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.277437 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13176h7m44.616994665s" Apr 20 19:08:17.357463 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.356924 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:17.357463 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.357043 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:17.371932 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.371866 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jbf6q" event={"ID":"1c7034ae-80c0-4dd1-9098-c92022ad516a","Type":"ContainerStarted","Data":"9abe4118ac52cad647f460a33ab1aa936fc060b38a706c3e8f8fe5ec20572786"} Apr 20 19:08:17.375257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.375227 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" event={"ID":"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf","Type":"ContainerStarted","Data":"7d4458af9fc14ccc2c193641b5c0f1554173f5bb26a31c5de99e4d48c9da17ee"} Apr 20 19:08:17.376675 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.376649 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wnxqf" event={"ID":"846d14a0-81fa-4701-9219-6c5631b28c34","Type":"ContainerStarted","Data":"dac72c310ad59f07b5f38caf224b66625c0454863b3f76a8788b81639a591a6f"} Apr 20 19:08:17.386484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.382117 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerStarted","Data":"2ed8fb45ea3593a9292cf6ad755b58c3c44dd65cb6a7e34d54a839da07d4bbf6"} Apr 20 19:08:17.387900 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.387830 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" event={"ID":"141e70e5-9790-4051-898b-ed3910ef3927","Type":"ContainerStarted","Data":"4ad6463b8b4fea8dcdd19efba367dad5179c9b9009dc40b3c3fee33c7a40379e"} Apr 20 19:08:17.393501 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.392243 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-qp25w" event={"ID":"406ba3b2-33e1-4cc9-941c-55a06a114e38","Type":"ContainerStarted","Data":"5c3e23be40705e62446c58d12ca1922d05d09863d0a58e44a674dc01df8526ab"} Apr 20 19:08:17.414648 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.414508 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nd8bd" event={"ID":"8adeca3e-66a6-48a4-81e8-13898bdffa54","Type":"ContainerStarted","Data":"cba2b3db3822eb76da5cbb17e25d1678b369725431045d905568efc657243bc9"} Apr 20 19:08:17.418571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.418510 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pm94g" event={"ID":"978e6f95-f2d1-47d8-a94e-f79ad5c672d6","Type":"ContainerStarted","Data":"9da66f4b235a99f5ba7780fa3e27841cac4485b35af32ad85a0f9bd11de45fd8"} Apr 20 19:08:17.428427 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.428354 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"5482a35fd7306bb4844c88ad095b35b87a3c3ff42df7d44e2150e6712c932e3e"} Apr 20 19:08:17.438046 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.438007 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal" event={"ID":"ec9ad27b137077b1d073e8b7bc68876b","Type":"ContainerStarted","Data":"fd6da18a43954b49b99fc503088eb69e06b0ce59c2b7761eb5d13efdff435b8b"} Apr 20 19:08:17.452988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.452499 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-126.ec2.internal" podStartSLOduration=2.4524807490000002 podStartE2EDuration="2.452480749s" podCreationTimestamp="2026-04-20 19:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-20 19:08:17.451595192 +0000 UTC m=+3.683885701" watchObservedRunningTime="2026-04-20 19:08:17.452480749 +0000 UTC m=+3.684771258" Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.952695 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:17.952749 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.952855 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.952864 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.952880 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.952891 2583 projected.go:194] Error preparing data for projected volume kube-api-access-frsq9 for pod openshift-network-diagnostics/network-check-target-xb2cz: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.952922 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:19.952904525 +0000 UTC m=+6.185195013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:17.953280 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:17.952945 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9 podName:d79586bb-910e-443a-be9f-96dd10cd1d31 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:19.952934026 +0000 UTC m=+6.185224518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-frsq9" (UniqueName: "kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9") pod "network-check-target-xb2cz" (UID: "d79586bb-910e-443a-be9f-96dd10cd1d31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:18.359218 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:18.359136 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:18.359693 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:18.359282 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:18.447757 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:18.447719 2583 generic.go:358] "Generic (PLEG): container finished" podID="9ff307b2820fbf179d500677369dc91d" containerID="4558f1b4e0254fc93fbd2b79f26b7e3bfdd8c204b967457dd5a33017b6f5ac1b" exitCode=0 Apr 20 19:08:18.448870 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:18.448843 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal" event={"ID":"9ff307b2820fbf179d500677369dc91d","Type":"ContainerDied","Data":"4558f1b4e0254fc93fbd2b79f26b7e3bfdd8c204b967457dd5a33017b6f5ac1b"} Apr 20 19:08:19.357597 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:19.357065 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:19.357597 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.357197 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:19.459095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:19.459054 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal" event={"ID":"9ff307b2820fbf179d500677369dc91d","Type":"ContainerStarted","Data":"bf10d6d676293d3d3474825ba4c26ee798350c5826b13b9cf72abd07698094cf"} Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:19.971771 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:19.971826 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.971969 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.972030 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:23.97201254 +0000 UTC m=+10.204303034 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.972468 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.972486 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.972500 2583 projected.go:194] Error preparing data for projected volume kube-api-access-frsq9 for pod openshift-network-diagnostics/network-check-target-xb2cz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:19.972581 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:19.972545 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9 podName:d79586bb-910e-443a-be9f-96dd10cd1d31 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:23.972529035 +0000 UTC m=+10.204819524 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-frsq9" (UniqueName: "kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9") pod "network-check-target-xb2cz" (UID: "d79586bb-910e-443a-be9f-96dd10cd1d31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:20.357130 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:20.357040 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:20.357289 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:20.357181 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:21.357413 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:21.357376 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:21.357760 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:21.357487 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:22.359407 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:22.359213 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:22.359407 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:22.359367 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:23.357432 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:23.357391 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:23.357615 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:23.357536 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:24.007980 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:24.007948 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:24.007994 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.008126 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.008189 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:32.008170035 +0000 UTC m=+18.240460521 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.008440 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.008457 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.008470 2583 projected.go:194] Error preparing data for projected volume kube-api-access-frsq9 for pod openshift-network-diagnostics/network-check-target-xb2cz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:24.008570 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.008522 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9 podName:d79586bb-910e-443a-be9f-96dd10cd1d31 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:32.008508265 +0000 UTC m=+18.240798755 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-frsq9" (UniqueName: "kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9") pod "network-check-target-xb2cz" (UID: "d79586bb-910e-443a-be9f-96dd10cd1d31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:24.359533 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:24.358530 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:24.359533 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:24.358662 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:25.357202 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:25.357168 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:25.357658 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:25.357287 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:26.357148 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:26.357111 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:26.357297 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:26.357228 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:27.357565 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:27.357533 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:27.358028 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:27.357648 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:28.356902 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:28.356861 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:28.357079 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:28.356991 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:29.357286 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:29.357247 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:29.357717 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:29.357379 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:30.357213 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:30.357181 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:30.357418 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:30.357343 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:31.357012 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:31.356977 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:31.357191 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:31.357106 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:32.068383 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:32.068343 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:32.068421 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.068534 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.068548 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.068571 2583 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.068583 2583 projected.go:194] Error preparing data for projected volume kube-api-access-frsq9 for pod openshift-network-diagnostics/network-check-target-xb2cz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.068614 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:48.068592801 +0000 UTC m=+34.300883292 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:32.068910 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.068638 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9 podName:d79586bb-910e-443a-be9f-96dd10cd1d31 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:48.068622727 +0000 UTC m=+34.300913232 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-frsq9" (UniqueName: "kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9") pod "network-check-target-xb2cz" (UID: "d79586bb-910e-443a-be9f-96dd10cd1d31") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:32.357256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:32.357172 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:32.357438 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:32.357329 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:33.356658 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:33.356626 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:33.357163 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:33.356723 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31" Apr 20 19:08:34.359878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.359660 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:34.360360 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:34.360002 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:08:34.486157 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.486076 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" event={"ID":"dcdcfdc4-6f4b-4b64-a365-14e07ea949cf","Type":"ContainerStarted","Data":"e134539e6bae5881cc0215c78cc163bfa7305abf136dbba758af4be2948801f1"} Apr 20 19:08:34.487394 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.487364 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wnxqf" event={"ID":"846d14a0-81fa-4701-9219-6c5631b28c34","Type":"ContainerStarted","Data":"af138efc460c412ef8547ce21ade11f33988c8ba762df57307c8052284cd3786"} Apr 20 19:08:34.488521 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.488499 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerStarted","Data":"4a106c31907550e1919f32f5380e8f09dc0752e507b80f1fb130aa1a3a2e9555"} Apr 20 19:08:34.489804 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.489780 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" event={"ID":"141e70e5-9790-4051-898b-ed3910ef3927","Type":"ContainerStarted","Data":"98b71faa840ae3a41658d1283e5b5121b4a37b4c080f0bba7cddfc01d0490026"} Apr 20 19:08:34.491099 ip-10-0-139-126 kubenswrapper[2583]: 
I0420 19:08:34.491076 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qp25w" event={"ID":"406ba3b2-33e1-4cc9-941c-55a06a114e38","Type":"ContainerStarted","Data":"14b554b75b7dbdd36c0598fa2cb5ce803d23dc9949036c21209d57060e15026b"}
Apr 20 19:08:34.492506 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.492477 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nd8bd" event={"ID":"8adeca3e-66a6-48a4-81e8-13898bdffa54","Type":"ContainerStarted","Data":"ee8f408a4048b93ab90cbbdbc9e51f6ac413b212984f5b8414cfedb209f983e1"}
Apr 20 19:08:34.494075 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.494047 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pm94g" event={"ID":"978e6f95-f2d1-47d8-a94e-f79ad5c672d6","Type":"ContainerStarted","Data":"2ee49db3fac8d003d2db4c333fd00024448540c34c46239850f7a959c2bbdac5"}
Apr 20 19:08:34.495598 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.495566 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"44460d677d8fc0a79f4c37925ca03f8f9a556958058331c5f55ba415ec04582d"}
Apr 20 19:08:34.513239 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.513191 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-126.ec2.internal" podStartSLOduration=19.513172799 podStartE2EDuration="19.513172799s" podCreationTimestamp="2026-04-20 19:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:08:19.47526621 +0000 UTC m=+5.707556720" watchObservedRunningTime="2026-04-20 19:08:34.513172799 +0000 UTC m=+20.745463308"
Apr 20 19:08:34.530934 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.530887 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qp25w" podStartSLOduration=3.48828112 podStartE2EDuration="20.530869993s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.940400684 +0000 UTC m=+3.172691177" lastFinishedPulling="2026-04-20 19:08:33.98298955 +0000 UTC m=+20.215280050" observedRunningTime="2026-04-20 19:08:34.530442969 +0000 UTC m=+20.762733477" watchObservedRunningTime="2026-04-20 19:08:34.530869993 +0000 UTC m=+20.763160501"
Apr 20 19:08:34.547573 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.547527 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nd8bd" podStartSLOduration=3.501988755 podStartE2EDuration="20.547512485s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.937937471 +0000 UTC m=+3.170227960" lastFinishedPulling="2026-04-20 19:08:33.9834612 +0000 UTC m=+20.215751690" observedRunningTime="2026-04-20 19:08:34.547328021 +0000 UTC m=+20.779618530" watchObservedRunningTime="2026-04-20 19:08:34.547512485 +0000 UTC m=+20.779802992"
Apr 20 19:08:34.582903 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.582866 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wnxqf" podStartSLOduration=3.526760224 podStartE2EDuration="20.582849835s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.926970291 +0000 UTC m=+3.159260778" lastFinishedPulling="2026-04-20 19:08:33.983059888 +0000 UTC m=+20.215350389" observedRunningTime="2026-04-20 19:08:34.562654192 +0000 UTC m=+20.794944702" watchObservedRunningTime="2026-04-20 19:08:34.582849835 +0000 UTC m=+20.815140342"
Apr 20 19:08:34.583274 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.583247 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kv9j6" podStartSLOduration=3.5280688700000002 podStartE2EDuration="20.583238235s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.929956504 +0000 UTC m=+3.162246989" lastFinishedPulling="2026-04-20 19:08:33.985125857 +0000 UTC m=+20.217416354" observedRunningTime="2026-04-20 19:08:34.582688884 +0000 UTC m=+20.814979391" watchObservedRunningTime="2026-04-20 19:08:34.583238235 +0000 UTC m=+20.815528743"
Apr 20 19:08:34.602645 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:34.602601 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pm94g" podStartSLOduration=3.515871224 podStartE2EDuration="20.602584963s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.933117794 +0000 UTC m=+3.165408293" lastFinishedPulling="2026-04-20 19:08:34.019831544 +0000 UTC m=+20.252122032" observedRunningTime="2026-04-20 19:08:34.602166879 +0000 UTC m=+20.834457387" watchObservedRunningTime="2026-04-20 19:08:34.602584963 +0000 UTC m=+20.834875471"
Apr 20 19:08:35.357538 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.357279 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:35.357733 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:35.357588 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:35.498020 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.497986 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jbf6q" event={"ID":"1c7034ae-80c0-4dd1-9098-c92022ad516a","Type":"ContainerStarted","Data":"5d15a6b3b41f06928d6abc75c0b9d482d90bfe48e378eb43da395521789eb603"}
Apr 20 19:08:35.499230 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.499204 2583 generic.go:358] "Generic (PLEG): container finished" podID="0eb04f4c-6eed-4f33-ae86-5126283315de" containerID="4a106c31907550e1919f32f5380e8f09dc0752e507b80f1fb130aa1a3a2e9555" exitCode=0
Apr 20 19:08:35.499364 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.499293 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerDied","Data":"4a106c31907550e1919f32f5380e8f09dc0752e507b80f1fb130aa1a3a2e9555"}
Apr 20 19:08:35.501612 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.501588 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"2566a57bae3434119c112a14fbcb7be27481a2172a60024e5dd490b076ef0154"}
Apr 20 19:08:35.501713 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.501617 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"2c70adb9aef8a38875151602b827433ec7a9d44f789682c63ff92549e5c08375"}
Apr 20 19:08:35.501713 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.501629 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"337beb87ebb20e5f9bffa7ebfc6420fa0ebadb377125547bd38251b50048e07f"}
Apr 20 19:08:35.501713 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.501642 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"2c44bad18a2c92904bd47e1043c2dd211bcce1fd1cfe53bb4f613c74be0dc10e"}
Apr 20 19:08:35.501713 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.501654 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"c12b77ee6262fbb6d8c64fcf2df5d8f8371ea4c2e52885e44b9c6881ceb5e2a7"}
Apr 20 19:08:35.513749 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.513712 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jbf6q" podStartSLOduration=4.472271093 podStartE2EDuration="21.513699967s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.941798542 +0000 UTC m=+3.174089031" lastFinishedPulling="2026-04-20 19:08:33.983227419 +0000 UTC m=+20.215517905" observedRunningTime="2026-04-20 19:08:35.513359248 +0000 UTC m=+21.745649752" watchObservedRunningTime="2026-04-20 19:08:35.513699967 +0000 UTC m=+21.745990475"
Apr 20 19:08:35.760675 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:35.760647 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 19:08:36.300957 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.300844 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:08:35.760667096Z","UUID":"7a5a97e4-008f-457c-bf54-ff16639f61be","Handler":null,"Name":"","Endpoint":""}
Apr 20 19:08:36.304638 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.304610 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 19:08:36.304788 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.304645 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 19:08:36.358645 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.358619 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:36.358806 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:36.358716 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813"
Apr 20 19:08:36.504529 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.504490 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" event={"ID":"141e70e5-9790-4051-898b-ed3910ef3927","Type":"ContainerStarted","Data":"6131d7f2642160f07f6c940d65ebee52962ba306313859a80f93a9fdddcae419"}
Apr 20 19:08:36.590552 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.590467 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:36.590719 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:36.590588 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:37.357692 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:37.357454 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:37.357857 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:37.357722 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:37.508907 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:37.508872 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" event={"ID":"141e70e5-9790-4051-898b-ed3910ef3927","Type":"ContainerStarted","Data":"049b27a0e36548a50f7d85ef583df6e79b1ef0601908fb81a8ef516dc42a4e24"}
Apr 20 19:08:37.512445 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:37.512410 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 19:08:37.512445 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:37.512424 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"4e7e8a699a0a854fe59582ceae07cc029326d093a70b64c54d0568a5d2cebb72"}
Apr 20 19:08:37.529727 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:37.529680 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2xt79" podStartSLOduration=3.516829626 podStartE2EDuration="23.529662488s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.942515436 +0000 UTC m=+3.174805923" lastFinishedPulling="2026-04-20 19:08:36.955348287 +0000 UTC m=+23.187638785" observedRunningTime="2026-04-20 19:08:37.529268255 +0000 UTC m=+23.761558776" watchObservedRunningTime="2026-04-20 19:08:37.529662488 +0000 UTC m=+23.761952996"
Apr 20 19:08:38.357327 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:38.357278 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:38.357537 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:38.357439 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813"
Apr 20 19:08:38.598503 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:38.598461 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:38.599156 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:38.598583 2583 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 20 19:08:38.599156 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:38.599011 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wnxqf"
Apr 20 19:08:39.357325 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:39.357278 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:39.357500 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:39.357412 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:40.356858 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.356628 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:40.357386 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:40.356892 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813"
Apr 20 19:08:40.519452 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.519413 2583 generic.go:358] "Generic (PLEG): container finished" podID="0eb04f4c-6eed-4f33-ae86-5126283315de" containerID="ebc45176db6e40340a1b061d85d8a6be85167908ca4eef3dd27d826b9dfde207" exitCode=0
Apr 20 19:08:40.519626 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.519496 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerDied","Data":"ebc45176db6e40340a1b061d85d8a6be85167908ca4eef3dd27d826b9dfde207"}
Apr 20 19:08:40.522490 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.522467 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" event={"ID":"8cbf6223-eb9e-4c30-9be7-c289ced45992","Type":"ContainerStarted","Data":"ef4fb11599d47b9acfd0f9dc260b108b05dcea63d823ddab2c2a303604d3c8db"}
Apr 20 19:08:40.522781 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.522762 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:40.522875 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.522793 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:40.537983 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.537957 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:40.566654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:40.566607 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" podStartSLOduration=9.181398112 podStartE2EDuration="26.566592724s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.928747753 +0000 UTC m=+3.161038239" lastFinishedPulling="2026-04-20 19:08:34.313942359 +0000 UTC m=+20.546232851" observedRunningTime="2026-04-20 19:08:40.566330461 +0000 UTC m=+26.798620969" watchObservedRunningTime="2026-04-20 19:08:40.566592724 +0000 UTC m=+26.798883232"
Apr 20 19:08:41.356859 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.356837 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:41.356988 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:41.356962 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:41.526356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.526300 2583 generic.go:358] "Generic (PLEG): container finished" podID="0eb04f4c-6eed-4f33-ae86-5126283315de" containerID="8c764b837f8fb184349c594c3a1dd72c40ae38004e16dbca47061b3b1bd45405" exitCode=0
Apr 20 19:08:41.526356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.526338 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerDied","Data":"8c764b837f8fb184349c594c3a1dd72c40ae38004e16dbca47061b3b1bd45405"}
Apr 20 19:08:41.526925 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.526905 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:41.543002 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.542788 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp"
Apr 20 19:08:41.680211 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.680179 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xb2cz"]
Apr 20 19:08:41.680364 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.680301 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:41.680422 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:41.680403 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:41.682237 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.682205 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9gbcz"]
Apr 20 19:08:41.682385 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:41.682343 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:41.682471 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:41.682449 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813"
Apr 20 19:08:42.530715 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:42.530632 2583 generic.go:358] "Generic (PLEG): container finished" podID="0eb04f4c-6eed-4f33-ae86-5126283315de" containerID="5bbc22c1fe30c9f4f39798016fb7aa54dbdf0b6b81aeb42c7dd2f068b80c8c11" exitCode=0
Apr 20 19:08:42.531099 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:42.530720 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerDied","Data":"5bbc22c1fe30c9f4f39798016fb7aa54dbdf0b6b81aeb42c7dd2f068b80c8c11"}
Apr 20 19:08:43.356794 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:43.356762 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:43.356924 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:43.356901 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:43.356924 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:43.356773 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:43.357037 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:43.357013 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813"
Apr 20 19:08:45.357108 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:45.357034 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:45.357747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:45.357031 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:45.357747 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:45.357155 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813"
Apr 20 19:08:45.357747 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:45.357218 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xb2cz" podUID="d79586bb-910e-443a-be9f-96dd10cd1d31"
Apr 20 19:08:47.068706 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.068631 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-126.ec2.internal" event="NodeReady"
Apr 20 19:08:47.069259 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.068809 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 19:08:47.127609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.127562 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tnhlk"]
Apr 20 19:08:47.132687 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.132659 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.137224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.137184 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 19:08:47.137429 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.137262 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 19:08:47.137635 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.137614 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ggf2q\""
Apr 20 19:08:47.145720 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.145003 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w5vbd"]
Apr 20 19:08:47.149388 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.149366 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tnhlk"]
Apr 20 19:08:47.149498 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.149483 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.152801 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.152630 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 19:08:47.154081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.154061 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w5vbd"]
Apr 20 19:08:47.154327 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.154284 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 19:08:47.154539 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.154520 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gtxhr\""
Apr 20 19:08:47.154685 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.154667 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 19:08:47.283476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.283430 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bvf\" (UniqueName: \"kubernetes.io/projected/b4d15e0b-c594-487d-9587-1c87df888ece-kube-api-access-q8bvf\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.283664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.283499 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/34e2771f-67fc-4041-92d6-4479b21afc45-tmp-dir\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.283664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.283528 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plv9w\" (UniqueName: \"kubernetes.io/projected/34e2771f-67fc-4041-92d6-4479b21afc45-kube-api-access-plv9w\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.283664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.283560 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.283664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.283619 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.283664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.283655 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34e2771f-67fc-4041-92d6-4479b21afc45-config-volume\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.356928 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.356846 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz"
Apr 20 19:08:47.357195 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.357167 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz"
Apr 20 19:08:47.360223 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.360201 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wstmx\""
Apr 20 19:08:47.360357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.360289 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bdmvj\""
Apr 20 19:08:47.360407 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.360361 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 19:08:47.360448 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.360432 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 19:08:47.360562 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.360547 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 19:08:47.384327 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384278 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34e2771f-67fc-4041-92d6-4479b21afc45-config-volume\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.384454 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384367 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8bvf\" (UniqueName: \"kubernetes.io/projected/b4d15e0b-c594-487d-9587-1c87df888ece-kube-api-access-q8bvf\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.384454 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384418 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/34e2771f-67fc-4041-92d6-4479b21afc45-tmp-dir\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.384454 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384445 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plv9w\" (UniqueName: \"kubernetes.io/projected/34e2771f-67fc-4041-92d6-4479b21afc45-kube-api-access-plv9w\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.384606 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384478 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.384606 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384502 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.384727 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.384615 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:47.384727 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.384668 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:47.884651395 +0000 UTC m=+34.116941885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:47.384833 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.384733 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:47.384833 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384756 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34e2771f-67fc-4041-92d6-4479b21afc45-config-volume\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.384833 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.384770 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:08:47.884756526 +0000 UTC m=+34.117047016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found
Apr 20 19:08:47.385024 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.384948 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/34e2771f-67fc-4041-92d6-4479b21afc45-tmp-dir\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.396352 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.396300 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plv9w\" (UniqueName: \"kubernetes.io/projected/34e2771f-67fc-4041-92d6-4479b21afc45-kube-api-access-plv9w\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.396705 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.396683 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8bvf\" (UniqueName: \"kubernetes.io/projected/b4d15e0b-c594-487d-9587-1c87df888ece-kube-api-access-q8bvf\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.887633 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.887596 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:08:47.887813 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:47.887645 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:08:47.887813 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.887758 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:47.887813 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.887776 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:47.887955 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.887832 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:48.887816079 +0000 UTC m=+35.120106565 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:47.887955 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:47.887847 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:08:48.887840942 +0000 UTC m=+35.120131428 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:08:48.089509 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.089473 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:08:48.089948 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.089568 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:48.089948 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:48.089655 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:08:48.089948 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:48.089734 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:20.089712918 +0000 UTC m=+66.322003420 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : secret "metrics-daemon-secret" not found Apr 20 19:08:48.092766 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.092741 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frsq9\" (UniqueName: \"kubernetes.io/projected/d79586bb-910e-443a-be9f-96dd10cd1d31-kube-api-access-frsq9\") pod \"network-check-target-xb2cz\" (UID: \"d79586bb-910e-443a-be9f-96dd10cd1d31\") " pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:48.269110 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.268890 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:48.427647 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.427613 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xb2cz"] Apr 20 19:08:48.465277 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:08:48.465245 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd79586bb_910e_443a_be9f_96dd10cd1d31.slice/crio-7f5ef5d1a06caf97e1647a62a549e3a250dd496067152c20c6f03291f6e3892c WatchSource:0}: Error finding container 7f5ef5d1a06caf97e1647a62a549e3a250dd496067152c20c6f03291f6e3892c: Status 404 returned error can't find the container with id 7f5ef5d1a06caf97e1647a62a549e3a250dd496067152c20c6f03291f6e3892c Apr 20 19:08:48.544523 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.544495 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xb2cz" 
event={"ID":"d79586bb-910e-443a-be9f-96dd10cd1d31","Type":"ContainerStarted","Data":"7f5ef5d1a06caf97e1647a62a549e3a250dd496067152c20c6f03291f6e3892c"} Apr 20 19:08:48.895255 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.895224 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:08:48.895484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:48.895261 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk" Apr 20 19:08:48.895484 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:48.895402 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:48.895484 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:48.895465 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:08:50.895445071 +0000 UTC m=+37.127735556 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:08:48.895484 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:48.895406 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:48.895690 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:48.895506 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:50.895494891 +0000 UTC m=+37.127785377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found Apr 20 19:08:49.549473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:49.549437 2583 generic.go:358] "Generic (PLEG): container finished" podID="0eb04f4c-6eed-4f33-ae86-5126283315de" containerID="1ca3de76ea92c68f22ba5390c5e508e82de5e4fe7fa95629d04e2fc1a87e71f6" exitCode=0 Apr 20 19:08:49.549901 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:49.549492 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerDied","Data":"1ca3de76ea92c68f22ba5390c5e508e82de5e4fe7fa95629d04e2fc1a87e71f6"} Apr 20 19:08:50.554980 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:50.554942 2583 generic.go:358] "Generic (PLEG): container finished" podID="0eb04f4c-6eed-4f33-ae86-5126283315de" containerID="920a27e01238d8105f1aca14d2c1e640936418946bed74aae47137bd38d1ff40" exitCode=0 Apr 20 
19:08:50.555432 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:50.555010 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerDied","Data":"920a27e01238d8105f1aca14d2c1e640936418946bed74aae47137bd38d1ff40"} Apr 20 19:08:50.912503 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:50.912460 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:08:50.912503 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:50.912503 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk" Apr 20 19:08:50.912729 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:50.912636 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:50.912729 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:50.912716 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:08:54.912696116 +0000 UTC m=+41.144986606 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:08:50.912854 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:50.912642 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:50.912854 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:50.912813 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:54.912782991 +0000 UTC m=+41.145073477 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found Apr 20 19:08:51.559807 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:51.559631 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m9kdg" event={"ID":"0eb04f4c-6eed-4f33-ae86-5126283315de","Type":"ContainerStarted","Data":"ed024e528cc31181603283ddfd48bce12f3e90855b7d0fef89fa74c434e49a1e"} Apr 20 19:08:51.561030 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:51.561007 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xb2cz" event={"ID":"d79586bb-910e-443a-be9f-96dd10cd1d31","Type":"ContainerStarted","Data":"a42e3fa9716a70821a4e2e0bcc57c72cbfc51fa8ca111d767ae13a39e0eaaad1"} Apr 20 19:08:51.584689 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:51.584640 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-m9kdg" podStartSLOduration=6.009551907 podStartE2EDuration="37.584624274s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:16.934792022 +0000 UTC m=+3.167082525" lastFinishedPulling="2026-04-20 19:08:48.509864403 +0000 UTC m=+34.742154892" observedRunningTime="2026-04-20 19:08:51.583064957 +0000 UTC m=+37.815355465" watchObservedRunningTime="2026-04-20 19:08:51.584624274 +0000 UTC m=+37.816914789" Apr 20 19:08:52.563126 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:52.563095 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:08:52.578825 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:52.578781 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xb2cz" podStartSLOduration=35.62053366 podStartE2EDuration="38.578766765s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:48.487348624 +0000 UTC m=+34.719639113" lastFinishedPulling="2026-04-20 19:08:51.445581715 +0000 UTC m=+37.677872218" observedRunningTime="2026-04-20 19:08:52.57871771 +0000 UTC m=+38.811008217" watchObservedRunningTime="2026-04-20 19:08:52.578766765 +0000 UTC m=+38.811057255" Apr 20 19:08:54.941424 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:54.941392 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:08:54.941424 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:08:54.941426 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk" Apr 20 19:08:54.941826 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:54.941525 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:54.941826 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:54.941532 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:54.941826 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:54.941583 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:02.941570036 +0000 UTC m=+49.173860522 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found Apr 20 19:08:54.941826 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:08:54.941597 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:09:02.941591115 +0000 UTC m=+49.173881600 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:09:02.993840 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:02.993797 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:09:02.993840 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:02.993837 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk" Apr 20 19:09:02.994372 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:02.993956 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:02.994372 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:02.993965 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:02.994372 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:02.994022 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:18.994007543 +0000 UTC m=+65.226298029 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found Apr 20 19:09:02.994372 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:02.994036 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:09:18.994030111 +0000 UTC m=+65.226320597 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:09:13.550896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:13.550870 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-46gfp" Apr 20 19:09:14.811539 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.811500 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb"] Apr 20 19:09:14.818269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.818238 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:14.822012 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.821988 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 19:09:14.822130 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.822016 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 19:09:14.822130 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.822050 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 19:09:14.822130 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.822022 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-8lbrs\"" Apr 20 19:09:14.822279 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.822145 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 19:09:14.824735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.824509 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb"] Apr 20 19:09:14.878018 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.877984 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6b3ec00f-3b4c-4298-96da-d54bfa834f61-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb\" (UID: \"6b3ec00f-3b4c-4298-96da-d54bfa834f61\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:14.878018 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.878017 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf2ft\" (UniqueName: \"kubernetes.io/projected/6b3ec00f-3b4c-4298-96da-d54bfa834f61-kube-api-access-vf2ft\") pod \"managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb\" (UID: \"6b3ec00f-3b4c-4298-96da-d54bfa834f61\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:14.978383 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.978352 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6b3ec00f-3b4c-4298-96da-d54bfa834f61-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb\" (UID: \"6b3ec00f-3b4c-4298-96da-d54bfa834f61\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:14.978533 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.978391 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf2ft\" (UniqueName: \"kubernetes.io/projected/6b3ec00f-3b4c-4298-96da-d54bfa834f61-kube-api-access-vf2ft\") pod \"managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb\" (UID: \"6b3ec00f-3b4c-4298-96da-d54bfa834f61\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:14.982531 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.982507 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6b3ec00f-3b4c-4298-96da-d54bfa834f61-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb\" (UID: \"6b3ec00f-3b4c-4298-96da-d54bfa834f61\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:14.992128 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:14.992100 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf2ft\" (UniqueName: \"kubernetes.io/projected/6b3ec00f-3b4c-4298-96da-d54bfa834f61-kube-api-access-vf2ft\") pod \"managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb\" (UID: \"6b3ec00f-3b4c-4298-96da-d54bfa834f61\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:15.134041 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:15.134003 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" Apr 20 19:09:15.255804 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:15.255773 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb"] Apr 20 19:09:15.259083 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:09:15.259050 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b3ec00f_3b4c_4298_96da_d54bfa834f61.slice/crio-6afae02d4af1c3b52edb9ab185c5d7499fb35dcb854bab28d451d5909e244f77 WatchSource:0}: Error finding container 6afae02d4af1c3b52edb9ab185c5d7499fb35dcb854bab28d451d5909e244f77: Status 404 returned error can't find the container with id 6afae02d4af1c3b52edb9ab185c5d7499fb35dcb854bab28d451d5909e244f77 Apr 20 19:09:15.611815 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:15.611779 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" 
event={"ID":"6b3ec00f-3b4c-4298-96da-d54bfa834f61","Type":"ContainerStarted","Data":"6afae02d4af1c3b52edb9ab185c5d7499fb35dcb854bab28d451d5909e244f77"} Apr 20 19:09:18.619037 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:18.619003 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" event={"ID":"6b3ec00f-3b4c-4298-96da-d54bfa834f61","Type":"ContainerStarted","Data":"2b609ec24dbb44be62e71f97ade6d4f2b0d952be520b881f36ec75fc571ddca3"} Apr 20 19:09:18.633968 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:18.633925 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7bc4cf65bc-jnvsb" podStartSLOduration=1.490188917 podStartE2EDuration="4.633913472s" podCreationTimestamp="2026-04-20 19:09:14 +0000 UTC" firstStartedPulling="2026-04-20 19:09:15.260848195 +0000 UTC m=+61.493138681" lastFinishedPulling="2026-04-20 19:09:18.404572736 +0000 UTC m=+64.636863236" observedRunningTime="2026-04-20 19:09:18.633813247 +0000 UTC m=+64.866103765" watchObservedRunningTime="2026-04-20 19:09:18.633913472 +0000 UTC m=+64.866203980" Apr 20 19:09:19.003854 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:19.003810 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:09:19.003854 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:19.003860 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk" Apr 20 
19:09:19.004056 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:19.003944 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:19.004056 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:19.003965 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:19.004056 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:19.004002 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. No retries permitted until 2026-04-20 19:09:51.003988359 +0000 UTC m=+97.236278845 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:09:19.004056 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:19.004016 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:51.004009918 +0000 UTC m=+97.236300404 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found Apr 20 19:09:20.111126 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:20.111090 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:09:20.111522 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:20.111231 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:09:20.111522 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:20.111297 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:24.111281524 +0000 UTC m=+130.343572010 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : secret "metrics-daemon-secret" not found Apr 20 19:09:23.567998 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:23.567963 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xb2cz" Apr 20 19:09:51.028472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:51.028428 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:09:51.028472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:09:51.028472 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk" Apr 20 19:09:51.029073 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:51.028588 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:51.029073 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:51.028615 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:51.029073 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:51.028665 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert podName:b4d15e0b-c594-487d-9587-1c87df888ece nodeName:}" failed. 
No retries permitted until 2026-04-20 19:10:55.028648514 +0000 UTC m=+161.260939005 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert") pod "ingress-canary-w5vbd" (UID: "b4d15e0b-c594-487d-9587-1c87df888ece") : secret "canary-serving-cert" not found Apr 20 19:09:51.029073 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:09:51.028680 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls podName:34e2771f-67fc-4041-92d6-4479b21afc45 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:55.028674261 +0000 UTC m=+161.260964747 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls") pod "dns-default-tnhlk" (UID: "34e2771f-67fc-4041-92d6-4479b21afc45") : secret "dns-default-metrics-tls" not found Apr 20 19:10:09.116299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.116267 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7fb965fb56-9fs7v"] Apr 20 19:10:09.118406 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.118391 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.121106 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.121084 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 19:10:09.121284 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.121269 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 19:10:09.121536 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.121521 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 19:10:09.121620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.121554 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:10:09.122110 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.122091 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6cdnp\"" Apr 20 19:10:09.122193 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.122111 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:10:09.122193 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.122099 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 19:10:09.131299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.131278 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7fb965fb56-9fs7v"] Apr 20 19:10:09.250288 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.250250 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-default-certificate\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.250288 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.250291 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-stats-auth\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.250554 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.250332 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.250554 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.250384 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.250554 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.250529 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2s7x\" (UniqueName: \"kubernetes.io/projected/aa77ff70-05c0-428a-9836-2f6d05ddecd7-kube-api-access-k2s7x\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" 
Apr 20 19:10:09.351650 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.351604 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-default-certificate\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.351650 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.351655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-stats-auth\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.351872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.351676 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.351872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.351694 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.351872 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:09.351781 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:10:09.351872 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:09.351830 2583 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:09.851810914 +0000 UTC m=+116.084101399 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : configmap references non-existent config key: service-ca.crt Apr 20 19:10:09.351872 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:09.351858 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:09.851849878 +0000 UTC m=+116.084140367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : secret "router-metrics-certs-default" not found Apr 20 19:10:09.352122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.351898 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2s7x\" (UniqueName: \"kubernetes.io/projected/aa77ff70-05c0-428a-9836-2f6d05ddecd7-kube-api-access-k2s7x\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.354322 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.354282 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-stats-auth\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.354422 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.354340 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-default-certificate\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.362735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.362710 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2s7x\" (UniqueName: \"kubernetes.io/projected/aa77ff70-05c0-428a-9836-2f6d05ddecd7-kube-api-access-k2s7x\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.856050 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.856017 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.856050 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:09.856051 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:09.856344 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:09.856154 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:10:09.856344 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:09.856220 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:10.856196087 +0000 UTC m=+117.088486573 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : configmap references non-existent config key: service-ca.crt Apr 20 19:10:09.856344 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:09.856248 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:10.856239436 +0000 UTC m=+117.088529928 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : secret "router-metrics-certs-default" not found Apr 20 19:10:10.861286 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:10.861254 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:10.861286 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:10.861293 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:10.861773 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:10.861443 2583 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:12.861421949 +0000 UTC m=+119.093712438 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : configmap references non-existent config key: service-ca.crt Apr 20 19:10:10.861773 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:10.861473 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:10:10.861773 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:10.861525 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:12.861512557 +0000 UTC m=+119.093803043 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : secret "router-metrics-certs-default" not found Apr 20 19:10:12.876894 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:12.876841 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:12.876894 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:12.876895 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:12.877403 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:12.876998 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:10:12.877403 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:12.877012 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:16.876993709 +0000 UTC m=+123.109284195 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : configmap references non-existent config key: service-ca.crt Apr 20 19:10:12.877403 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:12.877065 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:16.877046522 +0000 UTC m=+123.109337010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : secret "router-metrics-certs-default" not found Apr 20 19:10:14.449533 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:14.449503 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qp25w_406ba3b2-33e1-4cc9-941c-55a06a114e38/dns-node-resolver/0.log" Apr 20 19:10:15.249523 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:15.249496 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nd8bd_8adeca3e-66a6-48a4-81e8-13898bdffa54/node-ca/0.log" Apr 20 19:10:16.905920 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:16.905882 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:16.905920 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:16.905919 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:16.906377 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:16.906040 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:10:16.906377 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:16.906049 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:24.906030421 +0000 UTC m=+131.138320908 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : configmap references non-existent config key: service-ca.crt Apr 20 19:10:16.906377 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:16.906095 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:24.906079345 +0000 UTC m=+131.138369831 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : secret "router-metrics-certs-default" not found Apr 20 19:10:17.985558 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.985527 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mlqft"] Apr 20 19:10:17.987467 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.987452 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:17.990480 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.990451 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 19:10:17.992221 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.992195 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 19:10:17.992382 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.992203 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 19:10:17.992382 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.992260 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qxxwr\"" Apr 20 19:10:17.992382 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.992206 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:10:17.996301 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.996282 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 19:10:17.998846 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:17.998827 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mlqft"] Apr 20 19:10:18.115834 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.115798 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbnw6\" (UniqueName: \"kubernetes.io/projected/9befbc1e-c590-441a-af58-08f4e7b6d9a4-kube-api-access-bbnw6\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.116021 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.115887 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9befbc1e-c590-441a-af58-08f4e7b6d9a4-config\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.116021 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.115908 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9befbc1e-c590-441a-af58-08f4e7b6d9a4-trusted-ca\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.116021 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.115925 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9befbc1e-c590-441a-af58-08f4e7b6d9a4-serving-cert\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.216647 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.216621 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9befbc1e-c590-441a-af58-08f4e7b6d9a4-config\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.216740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.216654 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9befbc1e-c590-441a-af58-08f4e7b6d9a4-trusted-ca\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.216740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.216685 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9befbc1e-c590-441a-af58-08f4e7b6d9a4-serving-cert\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.216815 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.216772 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbnw6\" (UniqueName: \"kubernetes.io/projected/9befbc1e-c590-441a-af58-08f4e7b6d9a4-kube-api-access-bbnw6\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.217436 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.217404 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9befbc1e-c590-441a-af58-08f4e7b6d9a4-config\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.217560 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.217482 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9befbc1e-c590-441a-af58-08f4e7b6d9a4-trusted-ca\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.219148 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.219122 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9befbc1e-c590-441a-af58-08f4e7b6d9a4-serving-cert\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.226417 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.226392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbnw6\" (UniqueName: \"kubernetes.io/projected/9befbc1e-c590-441a-af58-08f4e7b6d9a4-kube-api-access-bbnw6\") pod \"console-operator-9d4b6777b-mlqft\" (UID: \"9befbc1e-c590-441a-af58-08f4e7b6d9a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.297695 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.297611 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:18.413333 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.413285 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mlqft"] Apr 20 19:10:18.418288 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:18.418258 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9befbc1e_c590_441a_af58_08f4e7b6d9a4.slice/crio-e7a8746b8f7c9c1306b40856d26236475a7ca276bd966b644d086ef26f650bca WatchSource:0}: Error finding container e7a8746b8f7c9c1306b40856d26236475a7ca276bd966b644d086ef26f650bca: Status 404 returned error can't find the container with id e7a8746b8f7c9c1306b40856d26236475a7ca276bd966b644d086ef26f650bca Apr 20 19:10:18.733585 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:18.733556 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" event={"ID":"9befbc1e-c590-441a-af58-08f4e7b6d9a4","Type":"ContainerStarted","Data":"e7a8746b8f7c9c1306b40856d26236475a7ca276bd966b644d086ef26f650bca"} Apr 20 19:10:19.006096 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.006003 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm"] Apr 20 19:10:19.008695 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.008674 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.011190 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.011169 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 19:10:19.011349 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.011330 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 19:10:19.011402 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.011344 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-75hsn\"" Apr 20 19:10:19.011402 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.011381 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:10:19.012301 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.012281 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 19:10:19.019540 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.019519 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm"] Apr 20 19:10:19.110809 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.110775 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s"] Apr 20 19:10:19.112940 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.112919 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.115860 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.115584 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 19:10:19.115860 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.115608 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 19:10:19.115860 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.115695 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 19:10:19.115860 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.115794 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:10:19.116072 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.115873 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ms7kv\"" Apr 20 19:10:19.123299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.122877 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.123299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.122949 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.123299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.122989 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq7w6\" (UniqueName: \"kubernetes.io/projected/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-kube-api-access-dq7w6\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.124158 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.124132 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s"] Apr 20 19:10:19.224053 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224010 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.224244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224069 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-config\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.224244 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224099 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.224244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224121 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbc65\" (UniqueName: \"kubernetes.io/projected/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-kube-api-access-kbc65\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.224244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq7w6\" (UniqueName: \"kubernetes.io/projected/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-kube-api-access-dq7w6\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.224244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224170 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.224819 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.224797 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.226800 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.226779 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.233220 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.233198 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq7w6\" (UniqueName: \"kubernetes.io/projected/7df06148-0d5a-4e8d-ba9d-6f7e64734e95-kube-api-access-dq7w6\") pod \"kube-storage-version-migrator-operator-6769c5d45-b5mfm\" (UID: \"7df06148-0d5a-4e8d-ba9d-6f7e64734e95\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.317499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.317419 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" Apr 20 19:10:19.325206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.325181 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbc65\" (UniqueName: \"kubernetes.io/projected/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-kube-api-access-kbc65\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.325350 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.325227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.325409 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.325379 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-config\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.325896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.325877 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-config\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.327690 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.327668 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.334053 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.334035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbc65\" (UniqueName: \"kubernetes.io/projected/39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6-kube-api-access-kbc65\") pod \"service-ca-operator-d6fc45fc5-g445s\" (UID: \"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.423660 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.423630 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" Apr 20 19:10:19.437456 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.437424 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm"] Apr 20 19:10:19.441708 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:19.441681 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df06148_0d5a_4e8d_ba9d_6f7e64734e95.slice/crio-190c4879a52c2f62e311dad0ae3d05eb6a28a7d9131e70c15c649445a09b1ab1 WatchSource:0}: Error finding container 190c4879a52c2f62e311dad0ae3d05eb6a28a7d9131e70c15c649445a09b1ab1: Status 404 returned error can't find the container with id 190c4879a52c2f62e311dad0ae3d05eb6a28a7d9131e70c15c649445a09b1ab1 Apr 20 19:10:19.551152 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.551120 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s"] Apr 20 19:10:19.554715 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:19.554685 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c5c9ae_a244_4bc1_a26c_70bf6a41b8e6.slice/crio-3fce870e076f82d1d58b779c9f3bfb31887fee5bc48e0a10f992c1a6c5133a46 WatchSource:0}: Error finding container 3fce870e076f82d1d58b779c9f3bfb31887fee5bc48e0a10f992c1a6c5133a46: Status 404 returned error can't find the container with id 3fce870e076f82d1d58b779c9f3bfb31887fee5bc48e0a10f992c1a6c5133a46 Apr 20 19:10:19.737043 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.736997 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" event={"ID":"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6","Type":"ContainerStarted","Data":"3fce870e076f82d1d58b779c9f3bfb31887fee5bc48e0a10f992c1a6c5133a46"} Apr 20 19:10:19.738078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:19.738047 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" event={"ID":"7df06148-0d5a-4e8d-ba9d-6f7e64734e95","Type":"ContainerStarted","Data":"190c4879a52c2f62e311dad0ae3d05eb6a28a7d9131e70c15c649445a09b1ab1"} Apr 20 19:10:20.742524 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:20.742436 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" event={"ID":"9befbc1e-c590-441a-af58-08f4e7b6d9a4","Type":"ContainerStarted","Data":"b3b23e17fade35279bd2fc3bb69a7aac0733cde0541f75d65989ad0a567ad6ad"} Apr 20 19:10:20.743151 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:20.742645 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:20.744643 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:20.744542 2583 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-mlqft container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.9:8443/readyz\": dial tcp 10.134.0.9:8443: connect: connection refused" start-of-body= Apr 20 19:10:20.744755 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:20.744683 2583 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" podUID="9befbc1e-c590-441a-af58-08f4e7b6d9a4" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.9:8443/readyz\": dial tcp 10.134.0.9:8443: connect: connection refused" Apr 20 19:10:20.759590 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:20.759536 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" podStartSLOduration=1.546394801 podStartE2EDuration="3.759519646s" podCreationTimestamp="2026-04-20 19:10:17 +0000 UTC" firstStartedPulling="2026-04-20 19:10:18.420036003 +0000 UTC m=+124.652326489" lastFinishedPulling="2026-04-20 19:10:20.633160848 +0000 UTC m=+126.865451334" observedRunningTime="2026-04-20 19:10:20.759154354 +0000 UTC m=+126.991444862" watchObservedRunningTime="2026-04-20 19:10:20.759519646 +0000 UTC m=+126.991810157" Apr 20 19:10:21.745715 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:21.745689 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/0.log" Apr 20 19:10:21.746214 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:21.745729 2583 generic.go:358] "Generic (PLEG): container finished" podID="9befbc1e-c590-441a-af58-08f4e7b6d9a4" containerID="b3b23e17fade35279bd2fc3bb69a7aac0733cde0541f75d65989ad0a567ad6ad" exitCode=255 Apr 20 19:10:21.746214 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:21.745781 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" event={"ID":"9befbc1e-c590-441a-af58-08f4e7b6d9a4","Type":"ContainerDied","Data":"b3b23e17fade35279bd2fc3bb69a7aac0733cde0541f75d65989ad0a567ad6ad"} Apr 20 19:10:21.746214 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:21.746086 2583 scope.go:117] "RemoveContainer" containerID="b3b23e17fade35279bd2fc3bb69a7aac0733cde0541f75d65989ad0a567ad6ad" Apr 20 19:10:22.749911 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.749881 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:10:22.750298 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.750262 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/0.log" Apr 20 19:10:22.750367 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.750297 2583 generic.go:358] "Generic (PLEG): container finished" podID="9befbc1e-c590-441a-af58-08f4e7b6d9a4" containerID="9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7" exitCode=255 Apr 20 19:10:22.750367 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.750345 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" event={"ID":"9befbc1e-c590-441a-af58-08f4e7b6d9a4","Type":"ContainerDied","Data":"9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7"} Apr 20 19:10:22.750430 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.750394 2583 scope.go:117] "RemoveContainer" containerID="b3b23e17fade35279bd2fc3bb69a7aac0733cde0541f75d65989ad0a567ad6ad" Apr 20 19:10:22.750684 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.750663 2583 scope.go:117] 
"RemoveContainer" containerID="9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7" Apr 20 19:10:22.750904 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:22.750881 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mlqft_openshift-console-operator(9befbc1e-c590-441a-af58-08f4e7b6d9a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" podUID="9befbc1e-c590-441a-af58-08f4e7b6d9a4" Apr 20 19:10:22.751741 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.751721 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" event={"ID":"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6","Type":"ContainerStarted","Data":"8435afa0cb2d5124c1178c401196b9a50585f8e91567a549cc0ec653ca933c9f"} Apr 20 19:10:22.753005 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.752987 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" event={"ID":"7df06148-0d5a-4e8d-ba9d-6f7e64734e95","Type":"ContainerStarted","Data":"004aa0dafd544051b603a5f8d425adefdc57bfa751eed118181512f8dfe9db0b"} Apr 20 19:10:22.782872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.782824 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" podStartSLOduration=2.343040072 podStartE2EDuration="4.78281146s" podCreationTimestamp="2026-04-20 19:10:18 +0000 UTC" firstStartedPulling="2026-04-20 19:10:19.444728759 +0000 UTC m=+125.677019244" lastFinishedPulling="2026-04-20 19:10:21.884500143 +0000 UTC m=+128.116790632" observedRunningTime="2026-04-20 19:10:22.782120767 +0000 UTC m=+129.014411269" 
watchObservedRunningTime="2026-04-20 19:10:22.78281146 +0000 UTC m=+129.015101968" Apr 20 19:10:22.808141 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:22.808094 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" podStartSLOduration=1.478747485 podStartE2EDuration="3.808077552s" podCreationTimestamp="2026-04-20 19:10:19 +0000 UTC" firstStartedPulling="2026-04-20 19:10:19.556889404 +0000 UTC m=+125.789179897" lastFinishedPulling="2026-04-20 19:10:21.886219473 +0000 UTC m=+128.118509964" observedRunningTime="2026-04-20 19:10:22.806913352 +0000 UTC m=+129.039203859" watchObservedRunningTime="2026-04-20 19:10:22.808077552 +0000 UTC m=+129.040368059" Apr 20 19:10:23.732913 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.732880 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj"] Apr 20 19:10:23.735362 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.735343 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" Apr 20 19:10:23.738207 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.738172 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-rwdr2\"" Apr 20 19:10:23.743354 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.743319 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj"] Apr 20 19:10:23.756350 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.756329 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:10:23.756746 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.756728 2583 scope.go:117] "RemoveContainer" containerID="9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7" Apr 20 19:10:23.756889 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:23.756872 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mlqft_openshift-console-operator(9befbc1e-c590-441a-af58-08f4e7b6d9a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" podUID="9befbc1e-c590-441a-af58-08f4e7b6d9a4" Apr 20 19:10:23.861965 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.861916 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58rqj\" (UniqueName: \"kubernetes.io/projected/efd97a6a-3f36-4480-9d4f-2e8113c15af9-kube-api-access-58rqj\") pod \"network-check-source-8894fc9bd-6tvvj\" (UID: \"efd97a6a-3f36-4480-9d4f-2e8113c15af9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" Apr 20 
19:10:23.955742 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.955702 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cff89fcd4-rcs2t"] Apr 20 19:10:23.957751 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.957734 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:23.960699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.960678 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w68rf\"" Apr 20 19:10:23.960962 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.960951 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 19:10:23.961027 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.960983 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 19:10:23.961243 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.961228 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 19:10:23.962572 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.962554 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58rqj\" (UniqueName: \"kubernetes.io/projected/efd97a6a-3f36-4480-9d4f-2e8113c15af9-kube-api-access-58rqj\") pod \"network-check-source-8894fc9bd-6tvvj\" (UID: \"efd97a6a-3f36-4480-9d4f-2e8113c15af9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" Apr 20 19:10:23.965958 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.965940 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 19:10:23.969477 ip-10-0-139-126 
kubenswrapper[2583]: I0420 19:10:23.969456 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cff89fcd4-rcs2t"] Apr 20 19:10:23.973679 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:23.973661 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58rqj\" (UniqueName: \"kubernetes.io/projected/efd97a6a-3f36-4480-9d4f-2e8113c15af9-kube-api-access-58rqj\") pod \"network-check-source-8894fc9bd-6tvvj\" (UID: \"efd97a6a-3f36-4480-9d4f-2e8113c15af9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" Apr 20 19:10:24.048764 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.048694 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" Apr 20 19:10:24.063738 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb2dcdb-56f6-44a1-8932-aebb49b373e0-ca-trust-extracted\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.063849 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063777 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-installation-pull-secrets\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.063849 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063834 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvll7\" (UniqueName: 
\"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-kube-api-access-cvll7\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.063956 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063852 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-image-registry-private-configuration\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.063956 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063870 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-certificates\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.063956 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063886 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-trusted-ca\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.063956 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063932 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-bound-sa-token\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: 
\"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.064099 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.063970 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.165041 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165008 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:10:24.165199 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165052 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvll7\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-kube-api-access-cvll7\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.165199 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165072 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-image-registry-private-configuration\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:24.165199 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.165153 2583 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 19:10:24.165410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165202 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-certificates\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.165410 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.165216 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs podName:45a7c0b2-25d6-499d-9c36-ca4ace9c7813 nodeName:}" failed. No retries permitted until 2026-04-20 19:12:26.165198884 +0000 UTC m=+252.397489373 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs") pod "network-metrics-daemon-9gbcz" (UID: "45a7c0b2-25d6-499d-9c36-ca4ace9c7813") : secret "metrics-daemon-secret" not found
Apr 20 19:10:24.165410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-trusted-ca\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.165410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165277 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-bound-sa-token\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.165410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165321 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.165410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165400 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb2dcdb-56f6-44a1-8932-aebb49b373e0-ca-trust-extracted\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.165697 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.165511 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:10:24.165697 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.165535 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cff89fcd4-rcs2t: secret "image-registry-tls" not found
Apr 20 19:10:24.165697 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.165626 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls podName:adb2dcdb-56f6-44a1-8932-aebb49b373e0 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:24.665573363 +0000 UTC m=+130.897863851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls") pod "image-registry-5cff89fcd4-rcs2t" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0") : secret "image-registry-tls" not found
Apr 20 19:10:24.165697 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165670 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-installation-pull-secrets\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.165868 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165754 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb2dcdb-56f6-44a1-8932-aebb49b373e0-ca-trust-extracted\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.166030 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.165995 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-certificates\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.166177 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.166148 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-trusted-ca\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.167638 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.167612 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj"]
Apr 20 19:10:24.167936 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.167919 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-image-registry-private-configuration\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.168543 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.168528 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-installation-pull-secrets\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.170682 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:24.170660 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd97a6a_3f36_4480_9d4f_2e8113c15af9.slice/crio-4b305a225c25c6473855d4cb3927095a208e448673f51c9e3f61fa9825c317e4 WatchSource:0}: Error finding container 4b305a225c25c6473855d4cb3927095a208e448673f51c9e3f61fa9825c317e4: Status 404 returned error can't find the container with id 4b305a225c25c6473855d4cb3927095a208e448673f51c9e3f61fa9825c317e4
Apr 20 19:10:24.174644 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.174621 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-bound-sa-token\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.174644 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.174637 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvll7\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-kube-api-access-cvll7\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.670705 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.670667 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:24.670913 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.670812 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:10:24.670913 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.670830 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cff89fcd4-rcs2t: secret "image-registry-tls" not found
Apr 20 19:10:24.670913 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.670883 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls podName:adb2dcdb-56f6-44a1-8932-aebb49b373e0 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:25.670868631 +0000 UTC m=+131.903159117 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls") pod "image-registry-5cff89fcd4-rcs2t" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0") : secret "image-registry-tls" not found
Apr 20 19:10:24.763871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.763832 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" event={"ID":"efd97a6a-3f36-4480-9d4f-2e8113c15af9","Type":"ContainerStarted","Data":"0ab1a5f00933542f40b9892a2619a4cf8e3596d158121abab52049eb18e05756"}
Apr 20 19:10:24.763871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.763873 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" event={"ID":"efd97a6a-3f36-4480-9d4f-2e8113c15af9","Type":"ContainerStarted","Data":"4b305a225c25c6473855d4cb3927095a208e448673f51c9e3f61fa9825c317e4"}
Apr 20 19:10:24.973756 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.973675 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:24.973900 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:24.973764 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:24.973900 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.973798 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 19:10:24.973900 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.973862 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:40.973841939 +0000 UTC m=+147.206132436 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : secret "router-metrics-certs-default" not found
Apr 20 19:10:24.974000 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:24.973967 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle podName:aa77ff70-05c0-428a-9836-2f6d05ddecd7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:40.973946801 +0000 UTC m=+147.206237290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle") pod "router-default-7fb965fb56-9fs7v" (UID: "aa77ff70-05c0-428a-9836-2f6d05ddecd7") : configmap references non-existent config key: service-ca.crt
Apr 20 19:10:25.679704 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.679665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:25.679875 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:25.679837 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:10:25.679875 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:25.679860 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cff89fcd4-rcs2t: secret "image-registry-tls" not found
Apr 20 19:10:25.679949 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:25.679927 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls podName:adb2dcdb-56f6-44a1-8932-aebb49b373e0 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:27.679908212 +0000 UTC m=+133.912198702 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls") pod "image-registry-5cff89fcd4-rcs2t" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0") : secret "image-registry-tls" not found
Apr 20 19:10:25.726529 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.726476 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6tvvj" podStartSLOduration=2.726456865 podStartE2EDuration="2.726456865s" podCreationTimestamp="2026-04-20 19:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:10:24.779128148 +0000 UTC m=+131.011418657" watchObservedRunningTime="2026-04-20 19:10:25.726456865 +0000 UTC m=+131.958747372"
Apr 20 19:10:25.727298 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.727281 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vrlc6"]
Apr 20 19:10:25.729380 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.729357 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.731920 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.731891 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 19:10:25.732166 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.732147 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 19:10:25.732608 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.732592 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 19:10:25.733121 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.733104 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 19:10:25.733188 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.733111 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qwcxh\""
Apr 20 19:10:25.739832 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.739807 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vrlc6"]
Apr 20 19:10:25.881327 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.881259 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/435b37d5-2251-4632-a312-07617c9af5af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.881735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.881339 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng2m9\" (UniqueName: \"kubernetes.io/projected/435b37d5-2251-4632-a312-07617c9af5af-kube-api-access-ng2m9\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.881735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.881410 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/435b37d5-2251-4632-a312-07617c9af5af-crio-socket\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.881735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.881434 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/435b37d5-2251-4632-a312-07617c9af5af-data-volume\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.881735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.881519 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982485 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982399 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982638 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982505 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/435b37d5-2251-4632-a312-07617c9af5af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982638 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982525 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng2m9\" (UniqueName: \"kubernetes.io/projected/435b37d5-2251-4632-a312-07617c9af5af-kube-api-access-ng2m9\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982638 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982552 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/435b37d5-2251-4632-a312-07617c9af5af-crio-socket\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982638 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982568 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/435b37d5-2251-4632-a312-07617c9af5af-data-volume\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982638 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:25.982585 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:25.982872 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:25.982669 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls podName:435b37d5-2251-4632-a312-07617c9af5af nodeName:}" failed. No retries permitted until 2026-04-20 19:10:26.482647596 +0000 UTC m=+132.714938084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vrlc6" (UID: "435b37d5-2251-4632-a312-07617c9af5af") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:25.982872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982674 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/435b37d5-2251-4632-a312-07617c9af5af-crio-socket\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.982872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.982854 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/435b37d5-2251-4632-a312-07617c9af5af-data-volume\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.983185 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.983163 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/435b37d5-2251-4632-a312-07617c9af5af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:25.991414 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:25.991393 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng2m9\" (UniqueName: \"kubernetes.io/projected/435b37d5-2251-4632-a312-07617c9af5af-kube-api-access-ng2m9\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:26.487286 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:26.487252 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:26.487525 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:26.487415 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:26.487525 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:26.487481 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls podName:435b37d5-2251-4632-a312-07617c9af5af nodeName:}" failed. No retries permitted until 2026-04-20 19:10:27.487464714 +0000 UTC m=+133.719755199 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vrlc6" (UID: "435b37d5-2251-4632-a312-07617c9af5af") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:27.495810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:27.495769 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:27.496171 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:27.495927 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:27.496171 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:27.495993 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls podName:435b37d5-2251-4632-a312-07617c9af5af nodeName:}" failed. No retries permitted until 2026-04-20 19:10:29.49597542 +0000 UTC m=+135.728265906 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vrlc6" (UID: "435b37d5-2251-4632-a312-07617c9af5af") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:27.697147 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:27.697110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:27.697293 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:27.697261 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:10:27.697293 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:27.697277 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cff89fcd4-rcs2t: secret "image-registry-tls" not found
Apr 20 19:10:27.697395 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:27.697362 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls podName:adb2dcdb-56f6-44a1-8932-aebb49b373e0 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:31.697343717 +0000 UTC m=+137.929634207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls") pod "image-registry-5cff89fcd4-rcs2t" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0") : secret "image-registry-tls" not found
Apr 20 19:10:28.298283 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:28.298248 2583 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft"
Apr 20 19:10:28.298652 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:28.298638 2583 scope.go:117] "RemoveContainer" containerID="9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7"
Apr 20 19:10:28.298819 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:28.298803 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mlqft_openshift-console-operator(9befbc1e-c590-441a-af58-08f4e7b6d9a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" podUID="9befbc1e-c590-441a-af58-08f4e7b6d9a4"
Apr 20 19:10:29.513167 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:29.513128 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:29.513579 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:29.513277 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:29.513579 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:29.513361 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls podName:435b37d5-2251-4632-a312-07617c9af5af nodeName:}" failed. No retries permitted until 2026-04-20 19:10:33.513346095 +0000 UTC m=+139.745636580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vrlc6" (UID: "435b37d5-2251-4632-a312-07617c9af5af") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:30.742960 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:30.742921 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft"
Apr 20 19:10:30.743340 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:30.743295 2583 scope.go:117] "RemoveContainer" containerID="9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7"
Apr 20 19:10:30.743500 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:30.743481 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mlqft_openshift-console-operator(9befbc1e-c590-441a-af58-08f4e7b6d9a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" podUID="9befbc1e-c590-441a-af58-08f4e7b6d9a4"
Apr 20 19:10:31.731368 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:31.731301 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:31.731525 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:31.731453 2583 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:10:31.731525 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:31.731474 2583 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cff89fcd4-rcs2t: secret "image-registry-tls" not found
Apr 20 19:10:31.731618 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:31.731530 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls podName:adb2dcdb-56f6-44a1-8932-aebb49b373e0 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:39.731514965 +0000 UTC m=+145.963805457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls") pod "image-registry-5cff89fcd4-rcs2t" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0") : secret "image-registry-tls" not found
Apr 20 19:10:33.545989 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:33.545949 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:33.546396 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:33.546063 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:33.546396 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:33.546125 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls podName:435b37d5-2251-4632-a312-07617c9af5af nodeName:}" failed. No retries permitted until 2026-04-20 19:10:41.54610981 +0000 UTC m=+147.778400295 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls") pod "insights-runtime-extractor-vrlc6" (UID: "435b37d5-2251-4632-a312-07617c9af5af") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:39.798476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:39.798440 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:39.801067 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:39.801040 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"image-registry-5cff89fcd4-rcs2t\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:39.866688 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:39.866657 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:39.988420 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:39.988291 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cff89fcd4-rcs2t"]
Apr 20 19:10:39.990897 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:39.990859 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadb2dcdb_56f6_44a1_8932_aebb49b373e0.slice/crio-5cf1b8a279fe1a6f39aab05094473bd07e0e48addb1454f547a69c1eb9ad1fb4 WatchSource:0}: Error finding container 5cf1b8a279fe1a6f39aab05094473bd07e0e48addb1454f547a69c1eb9ad1fb4: Status 404 returned error can't find the container with id 5cf1b8a279fe1a6f39aab05094473bd07e0e48addb1454f547a69c1eb9ad1fb4
Apr 20 19:10:40.804238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:40.804201 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" event={"ID":"adb2dcdb-56f6-44a1-8932-aebb49b373e0","Type":"ContainerStarted","Data":"caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed"}
Apr 20 19:10:40.804238 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:40.804242 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" event={"ID":"adb2dcdb-56f6-44a1-8932-aebb49b373e0","Type":"ContainerStarted","Data":"5cf1b8a279fe1a6f39aab05094473bd07e0e48addb1454f547a69c1eb9ad1fb4"}
Apr 20 19:10:40.804811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:40.804340 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t"
Apr 20 19:10:40.824518 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:40.824465 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" podStartSLOduration=17.824439823 podStartE2EDuration="17.824439823s" podCreationTimestamp="2026-04-20 19:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:10:40.823870929 +0000 UTC m=+147.056161438" watchObservedRunningTime="2026-04-20 19:10:40.824439823 +0000 UTC m=+147.056730331"
Apr 20 19:10:41.007052 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.007020 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:41.007234 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.007092 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:41.008098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.007967 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa77ff70-05c0-428a-9836-2f6d05ddecd7-service-ca-bundle\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:41.011432 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.010043 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa77ff70-05c0-428a-9836-2f6d05ddecd7-metrics-certs\") pod \"router-default-7fb965fb56-9fs7v\" (UID: \"aa77ff70-05c0-428a-9836-2f6d05ddecd7\") " pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:41.229820 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.229791 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6cdnp\""
Apr 20 19:10:41.238149 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.238117 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7fb965fb56-9fs7v"
Apr 20 19:10:41.358202 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.358155 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7fb965fb56-9fs7v"]
Apr 20 19:10:41.362175 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:41.362139 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa77ff70_05c0_428a_9836_2f6d05ddecd7.slice/crio-17a5219ea72b0cf44d50a1adf87b49557dd352e98bda06b25b818d55f9e8064c WatchSource:0}: Error finding container 17a5219ea72b0cf44d50a1adf87b49557dd352e98bda06b25b818d55f9e8064c: Status 404 returned error can't find the container with id 17a5219ea72b0cf44d50a1adf87b49557dd352e98bda06b25b818d55f9e8064c
Apr 20 19:10:41.613753 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.613712 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6"
Apr 20 19:10:41.616115 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.616093 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName:
\"kubernetes.io/secret/435b37d5-2251-4632-a312-07617c9af5af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vrlc6\" (UID: \"435b37d5-2251-4632-a312-07617c9af5af\") " pod="openshift-insights/insights-runtime-extractor-vrlc6" Apr 20 19:10:41.639109 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.639070 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vrlc6" Apr 20 19:10:41.757869 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.757835 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vrlc6"] Apr 20 19:10:41.760923 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:41.760893 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod435b37d5_2251_4632_a312_07617c9af5af.slice/crio-9ccf0bc3abeb646db2b0fee1a9e601688a7bb11dd47bc1e0496e0b0ea3994e1e WatchSource:0}: Error finding container 9ccf0bc3abeb646db2b0fee1a9e601688a7bb11dd47bc1e0496e0b0ea3994e1e: Status 404 returned error can't find the container with id 9ccf0bc3abeb646db2b0fee1a9e601688a7bb11dd47bc1e0496e0b0ea3994e1e Apr 20 19:10:41.808841 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.808811 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" event={"ID":"aa77ff70-05c0-428a-9836-2f6d05ddecd7","Type":"ContainerStarted","Data":"7140415537a6000348d49a8cba9e552294ce0dd340cce7f88651f473359a0bc2"} Apr 20 19:10:41.809160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.808852 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" event={"ID":"aa77ff70-05c0-428a-9836-2f6d05ddecd7","Type":"ContainerStarted","Data":"17a5219ea72b0cf44d50a1adf87b49557dd352e98bda06b25b818d55f9e8064c"} Apr 20 19:10:41.810277 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.810250 2583 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrlc6" event={"ID":"435b37d5-2251-4632-a312-07617c9af5af","Type":"ContainerStarted","Data":"9ccf0bc3abeb646db2b0fee1a9e601688a7bb11dd47bc1e0496e0b0ea3994e1e"} Apr 20 19:10:41.831671 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:41.831615 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" podStartSLOduration=32.831599072 podStartE2EDuration="32.831599072s" podCreationTimestamp="2026-04-20 19:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:10:41.830682489 +0000 UTC m=+148.062973000" watchObservedRunningTime="2026-04-20 19:10:41.831599072 +0000 UTC m=+148.063889579" Apr 20 19:10:42.238545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.238510 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:42.240963 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.240942 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:42.357376 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.357347 2583 scope.go:117] "RemoveContainer" containerID="9f05a4634483575b60c0700e4837cb20b2f9403581c6ad75cf0d03ca324b29c7" Apr 20 19:10:42.814007 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.813929 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:10:42.814458 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.814026 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" 
event={"ID":"9befbc1e-c590-441a-af58-08f4e7b6d9a4","Type":"ContainerStarted","Data":"6b0cf3e7103c1700b1bd1858e13f99b3578977327f4f4bb05f744aad62b78c06"} Apr 20 19:10:42.814458 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.814340 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:42.815832 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.815808 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrlc6" event={"ID":"435b37d5-2251-4632-a312-07617c9af5af","Type":"ContainerStarted","Data":"8a1d8791eb2b2143b17c7565d9eacafd256389d6ba1e2454067b7e9b4874e989"} Apr 20 19:10:42.815832 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.815834 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrlc6" event={"ID":"435b37d5-2251-4632-a312-07617c9af5af","Type":"ContainerStarted","Data":"63ac3223f7fba0af24f6fcb4c3cf4c950698555e683d0b31bd09d129621d5552"} Apr 20 19:10:42.815983 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.815973 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:42.817129 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:42.817112 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7fb965fb56-9fs7v" Apr 20 19:10:43.145389 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:43.145358 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-mlqft" Apr 20 19:10:44.822870 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:44.822836 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vrlc6" 
event={"ID":"435b37d5-2251-4632-a312-07617c9af5af","Type":"ContainerStarted","Data":"ec0eb1665264ac1a5750993bd392f4ca63d40f0d14a7c443ca4cc43d8e1983cf"} Apr 20 19:10:44.843268 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:44.843217 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vrlc6" podStartSLOduration=17.672438747 podStartE2EDuration="19.843198034s" podCreationTimestamp="2026-04-20 19:10:25 +0000 UTC" firstStartedPulling="2026-04-20 19:10:41.807679941 +0000 UTC m=+148.039970427" lastFinishedPulling="2026-04-20 19:10:43.978439228 +0000 UTC m=+150.210729714" observedRunningTime="2026-04-20 19:10:44.842069739 +0000 UTC m=+151.074360250" watchObservedRunningTime="2026-04-20 19:10:44.843198034 +0000 UTC m=+151.075488543" Apr 20 19:10:46.841334 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.841252 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c99c7b9fd-kzgx5"] Apr 20 19:10:46.844517 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.844492 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cff89fcd4-rcs2t"] Apr 20 19:10:46.845400 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.844763 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856013 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-864gq\" (UniqueName: \"kubernetes.io/projected/342fce42-7418-4e5d-90d8-e8160f8706f1-kube-api-access-864gq\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856056 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856059 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-console-config\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856301 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-oauth-serving-cert\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856348 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856373 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-service-ca\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856432 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-serving-cert\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856482 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-oauth-config\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856554 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856697 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pcrfv\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856748 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.856972 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.857167 2583 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 19:10:46.857547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.857393 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 19:10:46.858287 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.857839 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c99c7b9fd-kzgx5"] Apr 20 19:10:46.956867 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.956832 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-864gq\" (UniqueName: \"kubernetes.io/projected/342fce42-7418-4e5d-90d8-e8160f8706f1-kube-api-access-864gq\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.956867 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.956868 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-console-config\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957104 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.956917 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-oauth-serving-cert\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957104 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.956936 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-service-ca\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957104 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.956995 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-serving-cert\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957104 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.957031 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-oauth-config\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957687 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.957655 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-oauth-serving-cert\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957818 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.957670 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-console-config\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.957818 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.957692 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-service-ca\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.960045 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.960028 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-oauth-config\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.960264 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.960242 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-serving-cert\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:46.968140 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:46.968113 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-864gq\" (UniqueName: \"kubernetes.io/projected/342fce42-7418-4e5d-90d8-e8160f8706f1-kube-api-access-864gq\") pod \"console-6c99c7b9fd-kzgx5\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:47.167024 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:47.166937 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:47.297221 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:47.297190 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c99c7b9fd-kzgx5"] Apr 20 19:10:47.301493 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:47.301465 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342fce42_7418_4e5d_90d8_e8160f8706f1.slice/crio-04ea77f36c8b2469f4010ea68dee626adf2c959c6ebd15cc1eda73b6c3e866b7 WatchSource:0}: Error finding container 04ea77f36c8b2469f4010ea68dee626adf2c959c6ebd15cc1eda73b6c3e866b7: Status 404 returned error can't find the container with id 04ea77f36c8b2469f4010ea68dee626adf2c959c6ebd15cc1eda73b6c3e866b7 Apr 20 19:10:47.832746 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:47.832707 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c99c7b9fd-kzgx5" event={"ID":"342fce42-7418-4e5d-90d8-e8160f8706f1","Type":"ContainerStarted","Data":"04ea77f36c8b2469f4010ea68dee626adf2c959c6ebd15cc1eda73b6c3e866b7"} Apr 20 19:10:50.145016 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:50.144973 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tnhlk" podUID="34e2771f-67fc-4041-92d6-4479b21afc45" Apr 20 19:10:50.161320 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:50.161276 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-w5vbd" podUID="b4d15e0b-c594-487d-9587-1c87df888ece" Apr 20 19:10:50.374704 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:50.374666 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9gbcz" podUID="45a7c0b2-25d6-499d-9c36-ca4ace9c7813" Apr 20 19:10:50.843994 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:50.843961 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w5vbd" Apr 20 19:10:50.843994 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:50.843968 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c99c7b9fd-kzgx5" event={"ID":"342fce42-7418-4e5d-90d8-e8160f8706f1","Type":"ContainerStarted","Data":"789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46"} Apr 20 19:10:50.844257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:50.844009 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tnhlk" Apr 20 19:10:50.861273 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:50.861230 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c99c7b9fd-kzgx5" podStartSLOduration=2.344881759 podStartE2EDuration="4.861219252s" podCreationTimestamp="2026-04-20 19:10:46 +0000 UTC" firstStartedPulling="2026-04-20 19:10:47.303261537 +0000 UTC m=+153.535552023" lastFinishedPulling="2026-04-20 19:10:49.819599031 +0000 UTC m=+156.051889516" observedRunningTime="2026-04-20 19:10:50.860357084 +0000 UTC m=+157.092647589" watchObservedRunningTime="2026-04-20 19:10:50.861219252 +0000 UTC m=+157.093509760" Apr 20 19:10:53.971282 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.971241 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"] Apr 20 19:10:53.974757 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.974734 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" Apr 20 19:10:53.977884 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.977863 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 19:10:53.978003 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.977864 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 19:10:53.978003 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.977864 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-fd5gz\"" Apr 20 19:10:53.978003 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.977927 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 19:10:53.979112 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.979096 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 19:10:53.979166 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.979118 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 19:10:53.991015 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:53.990994 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"] Apr 20 19:10:54.020110 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.020083 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/521e9e92-3741-45e7-9ca3-a8f8e0338cec-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: 
\"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" Apr 20 19:10:54.020211 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.020116 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/521e9e92-3741-45e7-9ca3-a8f8e0338cec-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" Apr 20 19:10:54.020211 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.020153 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4m2c\" (UniqueName: \"kubernetes.io/projected/521e9e92-3741-45e7-9ca3-a8f8e0338cec-kube-api-access-f4m2c\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" Apr 20 19:10:54.020211 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.020177 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/521e9e92-3741-45e7-9ca3-a8f8e0338cec-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" Apr 20 19:10:54.050504 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.050472 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xtgnn"] Apr 20 19:10:54.053581 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.053562 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.061466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.061445 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 19:10:54.061740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.061726 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lszj4\""
Apr 20 19:10:54.061987 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.061972 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 19:10:54.063084 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.063070 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 19:10:54.121273 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121242 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-sys\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121279 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-metrics-client-ca\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121299 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dnvc\" (UniqueName: \"kubernetes.io/projected/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-kube-api-access-9dnvc\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121374 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/521e9e92-3741-45e7-9ca3-a8f8e0338cec-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.121484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121424 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121455 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/521e9e92-3741-45e7-9ca3-a8f8e0338cec-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.121747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121521 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4m2c\" (UniqueName: \"kubernetes.io/projected/521e9e92-3741-45e7-9ca3-a8f8e0338cec-kube-api-access-f4m2c\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.121747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121551 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121583 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-textfile\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121618 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/521e9e92-3741-45e7-9ca3-a8f8e0338cec-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.121747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121726 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-root\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121938 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121757 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-wtmp\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.121938 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121785 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.122003 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.121963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/521e9e92-3741-45e7-9ca3-a8f8e0338cec-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.124621 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.124588 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/521e9e92-3741-45e7-9ca3-a8f8e0338cec-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.124738 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.124679 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/521e9e92-3741-45e7-9ca3-a8f8e0338cec-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.129688 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.129663 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4m2c\" (UniqueName: \"kubernetes.io/projected/521e9e92-3741-45e7-9ca3-a8f8e0338cec-kube-api-access-f4m2c\") pod \"openshift-state-metrics-9d44df66c-nbtnh\" (UID: \"521e9e92-3741-45e7-9ca3-a8f8e0338cec\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.223177 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223092 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-textfile\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223177 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223160 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-root\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223184 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-wtmp\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223208 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223227 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-root\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223242 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-sys\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223292 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-sys\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223340 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-metrics-client-ca\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223374 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dnvc\" (UniqueName: \"kubernetes.io/projected/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-kube-api-access-9dnvc\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-wtmp\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223431 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223422 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223895 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223474 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223895 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223485 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-textfile\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.223895 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:54.223561 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 19:10:54.223895 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:54.223629 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls podName:05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:54.723610677 +0000 UTC m=+160.955901164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls") pod "node-exporter-xtgnn" (UID: "05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86") : secret "node-exporter-tls" not found
Apr 20 19:10:54.223895 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223794 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-accelerators-collector-config\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.224070 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.223923 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-metrics-client-ca\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.226025 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.226008 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.234011 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.233987 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dnvc\" (UniqueName: \"kubernetes.io/projected/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-kube-api-access-9dnvc\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.283852 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.283826 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"
Apr 20 19:10:54.405824 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.405790 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh"]
Apr 20 19:10:54.408905 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:54.408876 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod521e9e92_3741_45e7_9ca3_a8f8e0338cec.slice/crio-e105a620bdbe74c1694525233c836b795375a1e20cedfe3f4039cc475af3b5ec WatchSource:0}: Error finding container e105a620bdbe74c1694525233c836b795375a1e20cedfe3f4039cc475af3b5ec: Status 404 returned error can't find the container with id e105a620bdbe74c1694525233c836b795375a1e20cedfe3f4039cc475af3b5ec
Apr 20 19:10:54.726661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.726586 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:54.726826 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:54.726731 2583 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 19:10:54.726826 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:10:54.726796 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls podName:05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:55.726774447 +0000 UTC m=+161.959064936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls") pod "node-exporter-xtgnn" (UID: "05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86") : secret "node-exporter-tls" not found
Apr 20 19:10:54.855612 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.855581 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" event={"ID":"521e9e92-3741-45e7-9ca3-a8f8e0338cec","Type":"ContainerStarted","Data":"ac66dd7007080d4afd8a57915f871c1c83b3bc974371993dcd46e9a6f2b80998"}
Apr 20 19:10:54.855756 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.855620 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" event={"ID":"521e9e92-3741-45e7-9ca3-a8f8e0338cec","Type":"ContainerStarted","Data":"f423295619ec4335c0e76415b2c7c21d4d859625d978e1d0111b16a1509c7749"}
Apr 20 19:10:54.855756 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:54.855642 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" event={"ID":"521e9e92-3741-45e7-9ca3-a8f8e0338cec","Type":"ContainerStarted","Data":"e105a620bdbe74c1694525233c836b795375a1e20cedfe3f4039cc475af3b5ec"}
Apr 20 19:10:55.029328 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.029227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:10:55.029328 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.029259 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:10:55.031766 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.031740 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34e2771f-67fc-4041-92d6-4479b21afc45-metrics-tls\") pod \"dns-default-tnhlk\" (UID: \"34e2771f-67fc-4041-92d6-4479b21afc45\") " pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:10:55.031877 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.031797 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4d15e0b-c594-487d-9587-1c87df888ece-cert\") pod \"ingress-canary-w5vbd\" (UID: \"b4d15e0b-c594-487d-9587-1c87df888ece\") " pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:10:55.047922 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.047895 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ggf2q\""
Apr 20 19:10:55.048068 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.047925 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gtxhr\""
Apr 20 19:10:55.056020 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.056000 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tnhlk"
Apr 20 19:10:55.056094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.056000 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w5vbd"
Apr 20 19:10:55.190113 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.190084 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tnhlk"]
Apr 20 19:10:55.194233 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:55.194202 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34e2771f_67fc_4041_92d6_4479b21afc45.slice/crio-e08b66cff40a1877cb7a95482a1bc63f32aef7573ff72ae8edd6fad50484dbe8 WatchSource:0}: Error finding container e08b66cff40a1877cb7a95482a1bc63f32aef7573ff72ae8edd6fad50484dbe8: Status 404 returned error can't find the container with id e08b66cff40a1877cb7a95482a1bc63f32aef7573ff72ae8edd6fad50484dbe8
Apr 20 19:10:55.212616 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.212581 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w5vbd"]
Apr 20 19:10:55.215897 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:55.215873 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4d15e0b_c594_487d_9587_1c87df888ece.slice/crio-03209fe770bea4a6cbf8594b6198b0f564bfb5dbe9f72e95917d7fe748bbef8a WatchSource:0}: Error finding container 03209fe770bea4a6cbf8594b6198b0f564bfb5dbe9f72e95917d7fe748bbef8a: Status 404 returned error can't find the container with id 03209fe770bea4a6cbf8594b6198b0f564bfb5dbe9f72e95917d7fe748bbef8a
Apr 20 19:10:55.735300 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.735267 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:55.737707 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.737681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86-node-exporter-tls\") pod \"node-exporter-xtgnn\" (UID: \"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86\") " pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:55.858914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.858874 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w5vbd" event={"ID":"b4d15e0b-c594-487d-9587-1c87df888ece","Type":"ContainerStarted","Data":"03209fe770bea4a6cbf8594b6198b0f564bfb5dbe9f72e95917d7fe748bbef8a"}
Apr 20 19:10:55.859847 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.859823 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tnhlk" event={"ID":"34e2771f-67fc-4041-92d6-4479b21afc45","Type":"ContainerStarted","Data":"e08b66cff40a1877cb7a95482a1bc63f32aef7573ff72ae8edd6fad50484dbe8"}
Apr 20 19:10:55.862067 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.862053 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xtgnn"
Apr 20 19:10:55.969276 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.969241 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5b866f9874-rmh9f"]
Apr 20 19:10:55.994594 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.994520 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b866f9874-rmh9f"]
Apr 20 19:10:55.994757 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.994699 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:55.997916 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.997768 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-efdfef92k2rp\""
Apr 20 19:10:55.997916 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.997779 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-js59j\""
Apr 20 19:10:55.997916 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.997779 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 20 19:10:55.998169 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.998055 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 20 19:10:55.998169 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.998122 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 20 19:10:55.998356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.998236 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 20 19:10:55.998539 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:55.998443 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 20 19:10:56.030358 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.030319 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-fd8db9c6-8csww"]
Apr 20 19:10:56.037423 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037394 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037557 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037443 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037557 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037475 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdd2c82f-fd4f-4928-812f-989d513cf8f6-metrics-client-ca\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037557 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037526 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037716 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037588 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037716 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037615 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-tls\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037822 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037707 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-grpc-tls\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.037822 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.037742 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxjs\" (UniqueName: \"kubernetes.io/projected/fdd2c82f-fd4f-4928-812f-989d513cf8f6-kube-api-access-jlxjs\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.046324 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.046278 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fd8db9c6-8csww"]
Apr 20 19:10:56.046440 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.046424 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.054167 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.054143 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 19:10:56.081354 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:56.081297 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05f31ef0_ffd3_4e69_925a_2f9cf6f8fc86.slice/crio-6c8bbc232705d8b7c142e94eea62613e57e3b29b760edc71338c673250ed2737 WatchSource:0}: Error finding container 6c8bbc232705d8b7c142e94eea62613e57e3b29b760edc71338c673250ed2737: Status 404 returned error can't find the container with id 6c8bbc232705d8b7c142e94eea62613e57e3b29b760edc71338c673250ed2737
Apr 20 19:10:56.138582 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138550 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-oauth-serving-cert\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.138707 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138594 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-grpc-tls\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138707 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138619 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxjs\" (UniqueName: \"kubernetes.io/projected/fdd2c82f-fd4f-4928-812f-989d513cf8f6-kube-api-access-jlxjs\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138707 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138707 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138700 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138728 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-service-ca\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138750 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-trusted-ca-bundle\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdd2c82f-fd4f-4928-812f-989d513cf8f6-metrics-client-ca\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138807 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-console-config\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138877 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138904 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-oauth-config\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138940 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.138988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.138966 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djv5f\" (UniqueName: \"kubernetes.io/projected/05575f5f-797a-41a7-8bf7-335d16b08901-kube-api-access-djv5f\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.139387 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.139003 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-tls\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.139387 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.139035 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-serving-cert\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww"
Apr 20 19:10:56.139861 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.139843 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fdd2c82f-fd4f-4928-812f-989d513cf8f6-metrics-client-ca\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.146038 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.146005 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-tls\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.146416 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.146391 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.146548 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.146518 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.147143 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.147098 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f"
Apr 20 19:10:56.148975 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.148935 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" Apr 20 19:10:56.149608 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.149589 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fdd2c82f-fd4f-4928-812f-989d513cf8f6-secret-grpc-tls\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" Apr 20 19:10:56.150718 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.150633 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxjs\" (UniqueName: \"kubernetes.io/projected/fdd2c82f-fd4f-4928-812f-989d513cf8f6-kube-api-access-jlxjs\") pod \"thanos-querier-5b866f9874-rmh9f\" (UID: \"fdd2c82f-fd4f-4928-812f-989d513cf8f6\") " pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" Apr 20 19:10:56.239933 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.239899 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-oauth-config\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.240081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.239951 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djv5f\" (UniqueName: \"kubernetes.io/projected/05575f5f-797a-41a7-8bf7-335d16b08901-kube-api-access-djv5f\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 
19:10:56.240081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.239993 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-serving-cert\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.240081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.240037 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-oauth-serving-cert\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.240508 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.240471 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-service-ca\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.240508 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.240520 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-trusted-ca-bundle\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.240769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.240560 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-console-config\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " 
pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.241000 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.240975 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-oauth-serving-cert\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.241175 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.241152 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-service-ca\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.241338 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.241278 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-console-config\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.241875 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.241835 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-trusted-ca-bundle\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.243002 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.242960 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-oauth-config\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " 
pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.244762 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.244688 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-serving-cert\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.259117 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.259063 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djv5f\" (UniqueName: \"kubernetes.io/projected/05575f5f-797a-41a7-8bf7-335d16b08901-kube-api-access-djv5f\") pod \"console-fd8db9c6-8csww\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.306763 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.306728 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" Apr 20 19:10:56.365514 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.365484 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:10:56.473529 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.473491 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5b866f9874-rmh9f"] Apr 20 19:10:56.481566 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:56.481532 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd2c82f_fd4f_4928_812f_989d513cf8f6.slice/crio-26d0d9199c890f3182e5d93364632518ff1a5f89b0570010cdcaec5e141b874f WatchSource:0}: Error finding container 26d0d9199c890f3182e5d93364632518ff1a5f89b0570010cdcaec5e141b874f: Status 404 returned error can't find the container with id 26d0d9199c890f3182e5d93364632518ff1a5f89b0570010cdcaec5e141b874f Apr 20 19:10:56.543720 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.543213 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-fd8db9c6-8csww"] Apr 20 19:10:56.545720 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:10:56.545679 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05575f5f_797a_41a7_8bf7_335d16b08901.slice/crio-286df6f1d233fc0bc234dd1e375785798211ad6c4b32ba0c98c03ff782528cfd WatchSource:0}: Error finding container 286df6f1d233fc0bc234dd1e375785798211ad6c4b32ba0c98c03ff782528cfd: Status 404 returned error can't find the container with id 286df6f1d233fc0bc234dd1e375785798211ad6c4b32ba0c98c03ff782528cfd Apr 20 19:10:56.851987 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.851958 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:10:56.866185 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.866147 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" 
event={"ID":"521e9e92-3741-45e7-9ca3-a8f8e0338cec","Type":"ContainerStarted","Data":"6a5f23d41d4042cdab0c256725cfc3e7937bfbb2bb4872079327a09836af6426"} Apr 20 19:10:56.869190 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.868616 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fd8db9c6-8csww" event={"ID":"05575f5f-797a-41a7-8bf7-335d16b08901","Type":"ContainerStarted","Data":"4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e"} Apr 20 19:10:56.869190 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.868654 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fd8db9c6-8csww" event={"ID":"05575f5f-797a-41a7-8bf7-335d16b08901","Type":"ContainerStarted","Data":"286df6f1d233fc0bc234dd1e375785798211ad6c4b32ba0c98c03ff782528cfd"} Apr 20 19:10:56.870473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.870444 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"26d0d9199c890f3182e5d93364632518ff1a5f89b0570010cdcaec5e141b874f"} Apr 20 19:10:56.871614 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.871592 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtgnn" event={"ID":"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86","Type":"ContainerStarted","Data":"6c8bbc232705d8b7c142e94eea62613e57e3b29b760edc71338c673250ed2737"} Apr 20 19:10:56.892985 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.892938 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-nbtnh" podStartSLOduration=2.362692022 podStartE2EDuration="3.892923174s" podCreationTimestamp="2026-04-20 19:10:53 +0000 UTC" firstStartedPulling="2026-04-20 19:10:54.549565683 +0000 UTC m=+160.781856169" lastFinishedPulling="2026-04-20 19:10:56.079796829 +0000 UTC 
m=+162.312087321" observedRunningTime="2026-04-20 19:10:56.891603057 +0000 UTC m=+163.123893590" watchObservedRunningTime="2026-04-20 19:10:56.892923174 +0000 UTC m=+163.125213681" Apr 20 19:10:56.909005 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:56.908962 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-fd8db9c6-8csww" podStartSLOduration=0.908942541 podStartE2EDuration="908.942541ms" podCreationTimestamp="2026-04-20 19:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:10:56.9086103 +0000 UTC m=+163.140900808" watchObservedRunningTime="2026-04-20 19:10:56.908942541 +0000 UTC m=+163.141233049" Apr 20 19:10:57.167930 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:57.167855 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:57.167930 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:57.167917 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:57.173602 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:57.173578 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:57.880699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:57.880667 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:10:58.879569 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.879528 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w5vbd" event={"ID":"b4d15e0b-c594-487d-9587-1c87df888ece","Type":"ContainerStarted","Data":"7ee4ef9b0fd957f307049355c8b307804add50c817d7d82483b1512749cfa87a"} Apr 20 19:10:58.881637 ip-10-0-139-126 kubenswrapper[2583]: 
I0420 19:10:58.881612 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tnhlk" event={"ID":"34e2771f-67fc-4041-92d6-4479b21afc45","Type":"ContainerStarted","Data":"1bda9c8d82ee8f383244a20b4f472b36330f069caeee1dcbd38a5188275a3755"} Apr 20 19:10:58.881775 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.881645 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tnhlk" event={"ID":"34e2771f-67fc-4041-92d6-4479b21afc45","Type":"ContainerStarted","Data":"912468c4b23c80457caab2f90bcbe5f60e6e45650db80d80ea16195baa1bc754"} Apr 20 19:10:58.881851 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.881779 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tnhlk" Apr 20 19:10:58.883142 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.883119 2583 generic.go:358] "Generic (PLEG): container finished" podID="05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86" containerID="a163f4b64810d09488b0b98a19ef93db752c4017c8c344485605e7a4c5f67b01" exitCode=0 Apr 20 19:10:58.883254 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.883206 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtgnn" event={"ID":"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86","Type":"ContainerDied","Data":"a163f4b64810d09488b0b98a19ef93db752c4017c8c344485605e7a4c5f67b01"} Apr 20 19:10:58.896528 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.896484 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w5vbd" podStartSLOduration=129.075923483 podStartE2EDuration="2m11.896472257s" podCreationTimestamp="2026-04-20 19:08:47 +0000 UTC" firstStartedPulling="2026-04-20 19:10:55.218606884 +0000 UTC m=+161.450897370" lastFinishedPulling="2026-04-20 19:10:58.039155656 +0000 UTC m=+164.271446144" observedRunningTime="2026-04-20 19:10:58.895740366 +0000 UTC m=+165.128030854" 
watchObservedRunningTime="2026-04-20 19:10:58.896472257 +0000 UTC m=+165.128762758" Apr 20 19:10:58.930898 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:58.930842 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tnhlk" podStartSLOduration=129.094042674 podStartE2EDuration="2m11.930822888s" podCreationTimestamp="2026-04-20 19:08:47 +0000 UTC" firstStartedPulling="2026-04-20 19:10:55.196517282 +0000 UTC m=+161.428807772" lastFinishedPulling="2026-04-20 19:10:58.033297485 +0000 UTC m=+164.265587986" observedRunningTime="2026-04-20 19:10:58.930276808 +0000 UTC m=+165.162567316" watchObservedRunningTime="2026-04-20 19:10:58.930822888 +0000 UTC m=+165.163113397" Apr 20 19:10:59.888271 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:59.888237 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtgnn" event={"ID":"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86","Type":"ContainerStarted","Data":"894f082a34e18aca5e34a33b80e1bc3f481e133a6753ede474191c7d30f1fd40"} Apr 20 19:10:59.888704 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:59.888278 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xtgnn" event={"ID":"05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86","Type":"ContainerStarted","Data":"b507f14c50d1cefd39370961fac202563cdb0ea2351d501bd20f5984b242f15d"} Apr 20 19:10:59.889981 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:59.889958 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"7ff7b4859c689129e328b7a199015e2256aece1f94f018073f2084bdf49e3d20"} Apr 20 19:10:59.890074 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:59.889985 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" 
event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"b7aa5ce882882b74cfb7368bf69b1c0d9a84c9ec80cb44d67d7d9ec5b24dba78"} Apr 20 19:10:59.890074 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:59.889994 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"6e39dea73afd5ebc1649ed60ecc8cd77e5bfae64615005cb16c6f12abb62b5af"} Apr 20 19:10:59.910820 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:10:59.910782 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xtgnn" podStartSLOduration=3.96206548 podStartE2EDuration="5.910767841s" podCreationTimestamp="2026-04-20 19:10:54 +0000 UTC" firstStartedPulling="2026-04-20 19:10:56.082890439 +0000 UTC m=+162.315180932" lastFinishedPulling="2026-04-20 19:10:58.031592807 +0000 UTC m=+164.263883293" observedRunningTime="2026-04-20 19:10:59.908766327 +0000 UTC m=+166.141056835" watchObservedRunningTime="2026-04-20 19:10:59.910767841 +0000 UTC m=+166.143058351" Apr 20 19:11:00.896535 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:00.896458 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"06eb54c8dc0c7d4bb49c614e7cc84562ecda7a89710495265414b6396798ded4"} Apr 20 19:11:00.896535 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:00.896497 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"deaf48c4638760e56464b8416e0324b116bb8fb6b596347e81149c1a05e95075"} Apr 20 19:11:00.896535 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:00.896514 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" event={"ID":"fdd2c82f-fd4f-4928-812f-989d513cf8f6","Type":"ContainerStarted","Data":"75a1f5d4cb2cf12579378e3c0b98f8a3c8a4bba71e64a85f4f7917244973eeb7"} Apr 20 19:11:00.921323 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:00.921260 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" podStartSLOduration=2.184827824 podStartE2EDuration="5.921247536s" podCreationTimestamp="2026-04-20 19:10:55 +0000 UTC" firstStartedPulling="2026-04-20 19:10:56.484135576 +0000 UTC m=+162.716426062" lastFinishedPulling="2026-04-20 19:11:00.220555287 +0000 UTC m=+166.452845774" observedRunningTime="2026-04-20 19:11:00.919289085 +0000 UTC m=+167.151579592" watchObservedRunningTime="2026-04-20 19:11:00.921247536 +0000 UTC m=+167.153538072" Apr 20 19:11:01.900427 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:01.900389 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" Apr 20 19:11:04.041012 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.040967 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fd8db9c6-8csww"] Apr 20 19:11:04.068636 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.068609 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-778b785d57-zz64k"] Apr 20 19:11:04.071933 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.071913 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.084287 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.084258 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-778b785d57-zz64k"] Apr 20 19:11:04.215872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.215834 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-config\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.216052 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.215889 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-oauth-config\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.216052 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.215934 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-oauth-serving-cert\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.216052 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.215958 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxp22\" (UniqueName: \"kubernetes.io/projected/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-kube-api-access-gxp22\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" 
Apr 20 19:11:04.216052 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.215977 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-service-ca\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.216187 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.216049 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-serving-cert\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.216187 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.216112 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-trusted-ca-bundle\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.316945 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.316868 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxp22\" (UniqueName: \"kubernetes.io/projected/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-kube-api-access-gxp22\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.316945 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.316910 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-service-ca\") pod 
\"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.317084 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.316960 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-serving-cert\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.317084 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.316993 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-trusted-ca-bundle\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.317256 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.317227 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-config\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.317426 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.317329 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-oauth-config\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.317426 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.317357 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-oauth-serving-cert\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.318093 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.318069 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-service-ca\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.318226 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.318204 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-config\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.318476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.318444 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-oauth-serving-cert\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.318667 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.318646 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-trusted-ca-bundle\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.319967 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.319947 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-serving-cert\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.320519 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.320501 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-oauth-config\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.326030 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.326004 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxp22\" (UniqueName: \"kubernetes.io/projected/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-kube-api-access-gxp22\") pod \"console-778b785d57-zz64k\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.358871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.358844 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:11:04.380115 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.380091 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:04.496829 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.496800 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-778b785d57-zz64k"] Apr 20 19:11:04.499640 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:11:04.499610 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a1758b3_5e5c_4cc5_b7a4_c4ac7eb5a6eb.slice/crio-1b90eee737cd0e6377c93753e28a8d1339b3081ff7012cd25c4e7f41ffdc066e WatchSource:0}: Error finding container 1b90eee737cd0e6377c93753e28a8d1339b3081ff7012cd25c4e7f41ffdc066e: Status 404 returned error can't find the container with id 1b90eee737cd0e6377c93753e28a8d1339b3081ff7012cd25c4e7f41ffdc066e Apr 20 19:11:04.910277 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.910241 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778b785d57-zz64k" event={"ID":"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb","Type":"ContainerStarted","Data":"09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f"} Apr 20 19:11:04.910461 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.910283 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778b785d57-zz64k" event={"ID":"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb","Type":"ContainerStarted","Data":"1b90eee737cd0e6377c93753e28a8d1339b3081ff7012cd25c4e7f41ffdc066e"} Apr 20 19:11:04.952618 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:04.952571 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-778b785d57-zz64k" podStartSLOduration=0.952557699 podStartE2EDuration="952.557699ms" podCreationTimestamp="2026-04-20 19:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:11:04.951477968 +0000 UTC 
m=+171.183768476" watchObservedRunningTime="2026-04-20 19:11:04.952557699 +0000 UTC m=+171.184848206" Apr 20 19:11:06.366463 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:06.366429 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:11:07.910995 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:07.910968 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5b866f9874-rmh9f" Apr 20 19:11:08.892410 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:08.892380 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tnhlk" Apr 20 19:11:11.873220 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:11.873177 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" podUID="adb2dcdb-56f6-44a1-8932-aebb49b373e0" containerName="registry" containerID="cri-o://caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed" gracePeriod=30 Apr 20 19:11:12.112655 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.112628 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:11:12.189300 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189221 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-bound-sa-token\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189300 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189261 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvll7\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-kube-api-access-cvll7\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189300 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189290 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-certificates\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189551 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189438 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-installation-pull-secrets\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189551 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189543 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb2dcdb-56f6-44a1-8932-aebb49b373e0-ca-trust-extracted\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: 
\"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189631 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189575 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-image-registry-private-configuration\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189631 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189603 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-trusted-ca\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189728 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189643 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") pod \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\" (UID: \"adb2dcdb-56f6-44a1-8932-aebb49b373e0\") " Apr 20 19:11:12.189783 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.189726 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:12.190135 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.190090 2583 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-certificates\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.190279 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.190186 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:12.192054 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.192020 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:12.192253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.192226 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:12.192253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.192235 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:12.192422 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.192253 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-kube-api-access-cvll7" (OuterVolumeSpecName: "kube-api-access-cvll7") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "kube-api-access-cvll7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:12.192462 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.192442 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:12.201470 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.201445 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb2dcdb-56f6-44a1-8932-aebb49b373e0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "adb2dcdb-56f6-44a1-8932-aebb49b373e0" (UID: "adb2dcdb-56f6-44a1-8932-aebb49b373e0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:11:12.291324 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291272 2583 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/adb2dcdb-56f6-44a1-8932-aebb49b373e0-ca-trust-extracted\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.291324 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291327 2583 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-image-registry-private-configuration\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.291500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291344 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/adb2dcdb-56f6-44a1-8932-aebb49b373e0-trusted-ca\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.291500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291358 2583 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-registry-tls\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.291500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291369 2583 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-bound-sa-token\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.291500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291377 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvll7\" (UniqueName: \"kubernetes.io/projected/adb2dcdb-56f6-44a1-8932-aebb49b373e0-kube-api-access-cvll7\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 
19:11:12.291500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.291387 2583 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/adb2dcdb-56f6-44a1-8932-aebb49b373e0-installation-pull-secrets\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:12.935033 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.934994 2583 generic.go:358] "Generic (PLEG): container finished" podID="adb2dcdb-56f6-44a1-8932-aebb49b373e0" containerID="caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed" exitCode=0 Apr 20 19:11:12.935459 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.935037 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" event={"ID":"adb2dcdb-56f6-44a1-8932-aebb49b373e0","Type":"ContainerDied","Data":"caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed"} Apr 20 19:11:12.935459 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.935060 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" Apr 20 19:11:12.935459 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.935066 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cff89fcd4-rcs2t" event={"ID":"adb2dcdb-56f6-44a1-8932-aebb49b373e0","Type":"ContainerDied","Data":"5cf1b8a279fe1a6f39aab05094473bd07e0e48addb1454f547a69c1eb9ad1fb4"} Apr 20 19:11:12.935459 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.935086 2583 scope.go:117] "RemoveContainer" containerID="caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed" Apr 20 19:11:12.943276 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.943251 2583 scope.go:117] "RemoveContainer" containerID="caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed" Apr 20 19:11:12.943878 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:11:12.943586 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed\": container with ID starting with caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed not found: ID does not exist" containerID="caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed" Apr 20 19:11:12.944104 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.944031 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed"} err="failed to get container status \"caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed\": rpc error: code = NotFound desc = could not find container \"caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed\": container with ID starting with caebccab7a7f085ca0649c1880179193969767cf7cfff5917b6affffffd37aed not found: ID does not exist" Apr 20 19:11:12.957512 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:11:12.957483 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cff89fcd4-rcs2t"] Apr 20 19:11:12.961270 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:12.961245 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5cff89fcd4-rcs2t"] Apr 20 19:11:14.361385 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:14.361350 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb2dcdb-56f6-44a1-8932-aebb49b373e0" path="/var/lib/kubelet/pods/adb2dcdb-56f6-44a1-8932-aebb49b373e0/volumes" Apr 20 19:11:14.380323 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:14.380283 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:14.380480 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:14.380361 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:14.385442 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:14.385422 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:14.946250 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:14.946223 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:11:14.992820 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:14.992788 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c99c7b9fd-kzgx5"] Apr 20 19:11:29.060454 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.060389 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-fd8db9c6-8csww" podUID="05575f5f-797a-41a7-8bf7-335d16b08901" containerName="console" containerID="cri-o://4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e" gracePeriod=15 Apr 
20 19:11:29.328421 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.328399 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fd8db9c6-8csww_05575f5f-797a-41a7-8bf7-335d16b08901/console/0.log" Apr 20 19:11:29.328537 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.328459 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:11:29.429518 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.429480 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-oauth-config\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.429518 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.429525 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-service-ca\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.429748 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.429543 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djv5f\" (UniqueName: \"kubernetes.io/projected/05575f5f-797a-41a7-8bf7-335d16b08901-kube-api-access-djv5f\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.429748 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.429575 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-serving-cert\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.430564 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.429887 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-oauth-serving-cert\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.430564 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.430239 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-service-ca" (OuterVolumeSpecName: "service-ca") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:29.430564 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.430272 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-console-config\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.430564 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.430429 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-trusted-ca-bundle\") pod \"05575f5f-797a-41a7-8bf7-335d16b08901\" (UID: \"05575f5f-797a-41a7-8bf7-335d16b08901\") " Apr 20 19:11:29.430564 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.430500 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:29.433620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.430860 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-console-config" (OuterVolumeSpecName: "console-config") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:29.433620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.431138 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:29.433620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.431667 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-service-ca\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.433620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.431690 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-oauth-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.433620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.431707 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-console-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.436202 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:11:29.436171 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:29.436202 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.436180 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05575f5f-797a-41a7-8bf7-335d16b08901-kube-api-access-djv5f" (OuterVolumeSpecName: "kube-api-access-djv5f") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "kube-api-access-djv5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:29.436357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.436185 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "05575f5f-797a-41a7-8bf7-335d16b08901" (UID: "05575f5f-797a-41a7-8bf7-335d16b08901"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:29.532915 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.532876 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-oauth-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.532915 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.532910 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-djv5f\" (UniqueName: \"kubernetes.io/projected/05575f5f-797a-41a7-8bf7-335d16b08901-kube-api-access-djv5f\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.532915 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.532925 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/05575f5f-797a-41a7-8bf7-335d16b08901-console-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.533133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.532937 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05575f5f-797a-41a7-8bf7-335d16b08901-trusted-ca-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:29.982947 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.982917 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-fd8db9c6-8csww_05575f5f-797a-41a7-8bf7-335d16b08901/console/0.log" Apr 20 19:11:29.983101 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.982960 2583 generic.go:358] "Generic (PLEG): container finished" podID="05575f5f-797a-41a7-8bf7-335d16b08901" containerID="4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e" exitCode=2 Apr 20 19:11:29.983101 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.983040 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-fd8db9c6-8csww" event={"ID":"05575f5f-797a-41a7-8bf7-335d16b08901","Type":"ContainerDied","Data":"4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e"} Apr 20 19:11:29.983101 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.983066 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-fd8db9c6-8csww" event={"ID":"05575f5f-797a-41a7-8bf7-335d16b08901","Type":"ContainerDied","Data":"286df6f1d233fc0bc234dd1e375785798211ad6c4b32ba0c98c03ff782528cfd"} Apr 20 19:11:29.983101 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.983065 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-fd8db9c6-8csww" Apr 20 19:11:29.983101 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:29.983080 2583 scope.go:117] "RemoveContainer" containerID="4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e" Apr 20 19:11:30.003113 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:30.003090 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-fd8db9c6-8csww"] Apr 20 19:11:30.007109 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:30.007088 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-fd8db9c6-8csww"] Apr 20 19:11:30.008062 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:30.008044 2583 scope.go:117] "RemoveContainer" containerID="4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e" Apr 20 19:11:30.008415 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:11:30.008301 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e\": container with ID starting with 4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e not found: ID does not exist" containerID="4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e" Apr 20 
19:11:30.008483 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:30.008426 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e"} err="failed to get container status \"4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e\": rpc error: code = NotFound desc = could not find container \"4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e\": container with ID starting with 4814d80fd36019a4e88408c3c41fe846ed9e9adef485cdc0b571745b4065535e not found: ID does not exist" Apr 20 19:11:30.361379 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:30.361343 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05575f5f-797a-41a7-8bf7-335d16b08901" path="/var/lib/kubelet/pods/05575f5f-797a-41a7-8bf7-335d16b08901/volumes" Apr 20 19:11:38.007944 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:38.007910 2583 generic.go:358] "Generic (PLEG): container finished" podID="7df06148-0d5a-4e8d-ba9d-6f7e64734e95" containerID="004aa0dafd544051b603a5f8d425adefdc57bfa751eed118181512f8dfe9db0b" exitCode=0 Apr 20 19:11:38.008364 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:38.007983 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" event={"ID":"7df06148-0d5a-4e8d-ba9d-6f7e64734e95","Type":"ContainerDied","Data":"004aa0dafd544051b603a5f8d425adefdc57bfa751eed118181512f8dfe9db0b"} Apr 20 19:11:38.008364 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:38.008293 2583 scope.go:117] "RemoveContainer" containerID="004aa0dafd544051b603a5f8d425adefdc57bfa751eed118181512f8dfe9db0b" Apr 20 19:11:39.012032 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:39.012000 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-b5mfm" 
event={"ID":"7df06148-0d5a-4e8d-ba9d-6f7e64734e95","Type":"ContainerStarted","Data":"f5925cd46921d40380173fd94e523547165eee788e6ad6a4b4b3b6e3bca400f2"} Apr 20 19:11:40.013277 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.013228 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c99c7b9fd-kzgx5" podUID="342fce42-7418-4e5d-90d8-e8160f8706f1" containerName="console" containerID="cri-o://789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46" gracePeriod=15 Apr 20 19:11:40.250099 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.250074 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c99c7b9fd-kzgx5_342fce42-7418-4e5d-90d8-e8160f8706f1/console/0.log" Apr 20 19:11:40.250197 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.250131 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:11:40.317466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317377 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-oauth-config\") pod \"342fce42-7418-4e5d-90d8-e8160f8706f1\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " Apr 20 19:11:40.317466 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317426 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-oauth-serving-cert\") pod \"342fce42-7418-4e5d-90d8-e8160f8706f1\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " Apr 20 19:11:40.317680 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317469 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-service-ca\") pod \"342fce42-7418-4e5d-90d8-e8160f8706f1\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " Apr 20 19:11:40.317680 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317540 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-serving-cert\") pod \"342fce42-7418-4e5d-90d8-e8160f8706f1\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " Apr 20 19:11:40.317680 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317566 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-console-config\") pod \"342fce42-7418-4e5d-90d8-e8160f8706f1\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " Apr 20 19:11:40.317680 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317594 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-864gq\" (UniqueName: \"kubernetes.io/projected/342fce42-7418-4e5d-90d8-e8160f8706f1-kube-api-access-864gq\") pod \"342fce42-7418-4e5d-90d8-e8160f8706f1\" (UID: \"342fce42-7418-4e5d-90d8-e8160f8706f1\") " Apr 20 19:11:40.317972 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317934 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "342fce42-7418-4e5d-90d8-e8160f8706f1" (UID: "342fce42-7418-4e5d-90d8-e8160f8706f1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:40.318090 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317974 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-console-config" (OuterVolumeSpecName: "console-config") pod "342fce42-7418-4e5d-90d8-e8160f8706f1" (UID: "342fce42-7418-4e5d-90d8-e8160f8706f1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:40.318090 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.317982 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-service-ca" (OuterVolumeSpecName: "service-ca") pod "342fce42-7418-4e5d-90d8-e8160f8706f1" (UID: "342fce42-7418-4e5d-90d8-e8160f8706f1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:40.319764 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.319740 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342fce42-7418-4e5d-90d8-e8160f8706f1-kube-api-access-864gq" (OuterVolumeSpecName: "kube-api-access-864gq") pod "342fce42-7418-4e5d-90d8-e8160f8706f1" (UID: "342fce42-7418-4e5d-90d8-e8160f8706f1"). InnerVolumeSpecName "kube-api-access-864gq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:40.319907 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.319891 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "342fce42-7418-4e5d-90d8-e8160f8706f1" (UID: "342fce42-7418-4e5d-90d8-e8160f8706f1"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:40.319989 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.319970 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "342fce42-7418-4e5d-90d8-e8160f8706f1" (UID: "342fce42-7418-4e5d-90d8-e8160f8706f1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:40.419147 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.419118 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:40.419147 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.419142 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-console-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:40.419147 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.419152 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-864gq\" (UniqueName: \"kubernetes.io/projected/342fce42-7418-4e5d-90d8-e8160f8706f1-kube-api-access-864gq\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:40.419361 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.419161 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342fce42-7418-4e5d-90d8-e8160f8706f1-console-oauth-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:40.419361 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.419170 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-oauth-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:40.419361 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:40.419180 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/342fce42-7418-4e5d-90d8-e8160f8706f1-service-ca\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:11:41.024249 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.024222 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c99c7b9fd-kzgx5_342fce42-7418-4e5d-90d8-e8160f8706f1/console/0.log" Apr 20 19:11:41.024642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.024264 2583 generic.go:358] "Generic (PLEG): container finished" podID="342fce42-7418-4e5d-90d8-e8160f8706f1" containerID="789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46" exitCode=2 Apr 20 19:11:41.024642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.024352 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c99c7b9fd-kzgx5" event={"ID":"342fce42-7418-4e5d-90d8-e8160f8706f1","Type":"ContainerDied","Data":"789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46"} Apr 20 19:11:41.024642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.024371 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6c99c7b9fd-kzgx5" Apr 20 19:11:41.024642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.024391 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c99c7b9fd-kzgx5" event={"ID":"342fce42-7418-4e5d-90d8-e8160f8706f1","Type":"ContainerDied","Data":"04ea77f36c8b2469f4010ea68dee626adf2c959c6ebd15cc1eda73b6c3e866b7"} Apr 20 19:11:41.024642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.024406 2583 scope.go:117] "RemoveContainer" containerID="789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46" Apr 20 19:11:41.032450 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.032432 2583 scope.go:117] "RemoveContainer" containerID="789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46" Apr 20 19:11:41.032697 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:11:41.032679 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46\": container with ID starting with 789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46 not found: ID does not exist" containerID="789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46" Apr 20 19:11:41.032731 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.032706 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46"} err="failed to get container status \"789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46\": rpc error: code = NotFound desc = could not find container \"789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46\": container with ID starting with 789ca6d6bc6add09b53a99792903b6d7834975e3356402cae74a2adcdc1d9a46 not found: ID does not exist" Apr 20 19:11:41.040750 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.040725 2583 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c99c7b9fd-kzgx5"] Apr 20 19:11:41.044772 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:41.044751 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c99c7b9fd-kzgx5"] Apr 20 19:11:42.360810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:42.360778 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="342fce42-7418-4e5d-90d8-e8160f8706f1" path="/var/lib/kubelet/pods/342fce42-7418-4e5d-90d8-e8160f8706f1/volumes" Apr 20 19:11:53.057116 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:53.057024 2583 generic.go:358] "Generic (PLEG): container finished" podID="39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6" containerID="8435afa0cb2d5124c1178c401196b9a50585f8e91567a549cc0ec653ca933c9f" exitCode=0 Apr 20 19:11:53.057116 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:53.057103 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" event={"ID":"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6","Type":"ContainerDied","Data":"8435afa0cb2d5124c1178c401196b9a50585f8e91567a549cc0ec653ca933c9f"} Apr 20 19:11:53.057524 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:53.057429 2583 scope.go:117] "RemoveContainer" containerID="8435afa0cb2d5124c1178c401196b9a50585f8e91567a549cc0ec653ca933c9f" Apr 20 19:11:54.061872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:11:54.061834 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g445s" event={"ID":"39c5c9ae-a244-4bc1-a26c-70bf6a41b8e6","Type":"ContainerStarted","Data":"3aa28609502634f51e51b071a826fff6e66720236453ec90bb7c06d26c189439"} Apr 20 19:12:22.238686 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.238645 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58b547c588-rtw7b"] Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:12:22.238946 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adb2dcdb-56f6-44a1-8932-aebb49b373e0" containerName="registry" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.238959 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb2dcdb-56f6-44a1-8932-aebb49b373e0" containerName="registry" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.238970 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05575f5f-797a-41a7-8bf7-335d16b08901" containerName="console" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.238975 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="05575f5f-797a-41a7-8bf7-335d16b08901" containerName="console" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.238985 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="342fce42-7418-4e5d-90d8-e8160f8706f1" containerName="console" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.238990 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="342fce42-7418-4e5d-90d8-e8160f8706f1" containerName="console" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.239046 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="05575f5f-797a-41a7-8bf7-335d16b08901" containerName="console" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.239055 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="342fce42-7418-4e5d-90d8-e8160f8706f1" containerName="console" Apr 20 19:12:22.239098 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.239062 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="adb2dcdb-56f6-44a1-8932-aebb49b373e0" containerName="registry" Apr 20 19:12:22.243262 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.243243 2583 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.252925 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.252905 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58b547c588-rtw7b"] Apr 20 19:12:22.363106 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363070 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-serving-cert\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.363282 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363114 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-console-config\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.363282 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363158 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6r9w\" (UniqueName: \"kubernetes.io/projected/96f59f1e-15fc-403e-b192-172a795b295e-kube-api-access-g6r9w\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.363282 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-oauth-config\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " 
pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.363282 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363279 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-service-ca\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.363482 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363302 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-oauth-serving-cert\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.363482 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.363341 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-trusted-ca-bundle\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.463845 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.463786 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6r9w\" (UniqueName: \"kubernetes.io/projected/96f59f1e-15fc-403e-b192-172a795b295e-kube-api-access-g6r9w\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464049 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.463922 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-oauth-config\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464049 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.463950 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-service-ca\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464168 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464090 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-oauth-serving-cert\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464168 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464133 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-trusted-ca-bundle\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464155 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-serving-cert\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464200 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-console-config\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464756 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464726 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-service-ca\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464879 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464729 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-oauth-serving-cert\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.464879 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.464850 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-console-config\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.465158 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.465133 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-trusted-ca-bundle\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.466733 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.466701 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-oauth-config\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.466733 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.466719 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-serving-cert\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.472367 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.472347 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6r9w\" (UniqueName: \"kubernetes.io/projected/96f59f1e-15fc-403e-b192-172a795b295e-kube-api-access-g6r9w\") pod \"console-58b547c588-rtw7b\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.553452 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.553361 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:22.690474 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:22.690449 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58b547c588-rtw7b"] Apr 20 19:12:22.693135 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:12:22.693111 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96f59f1e_15fc_403e_b192_172a795b295e.slice/crio-391a60142990de21409bd7b49c0d5b6dbd5e18fd1d697c9c734a68fa9f3d86e6 WatchSource:0}: Error finding container 391a60142990de21409bd7b49c0d5b6dbd5e18fd1d697c9c734a68fa9f3d86e6: Status 404 returned error can't find the container with id 391a60142990de21409bd7b49c0d5b6dbd5e18fd1d697c9c734a68fa9f3d86e6 Apr 20 19:12:23.143042 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:23.143006 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58b547c588-rtw7b" event={"ID":"96f59f1e-15fc-403e-b192-172a795b295e","Type":"ContainerStarted","Data":"4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8"} Apr 20 19:12:23.143042 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:23.143043 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58b547c588-rtw7b" event={"ID":"96f59f1e-15fc-403e-b192-172a795b295e","Type":"ContainerStarted","Data":"391a60142990de21409bd7b49c0d5b6dbd5e18fd1d697c9c734a68fa9f3d86e6"} Apr 20 19:12:23.165786 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:23.165746 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58b547c588-rtw7b" podStartSLOduration=1.165732004 podStartE2EDuration="1.165732004s" podCreationTimestamp="2026-04-20 19:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:12:23.165021571 +0000 UTC 
m=+249.397312091" watchObservedRunningTime="2026-04-20 19:12:23.165732004 +0000 UTC m=+249.398022511" Apr 20 19:12:26.195386 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:26.195347 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:12:26.197904 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:26.197881 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/45a7c0b2-25d6-499d-9c36-ca4ace9c7813-metrics-certs\") pod \"network-metrics-daemon-9gbcz\" (UID: \"45a7c0b2-25d6-499d-9c36-ca4ace9c7813\") " pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:12:26.262741 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:26.262715 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bdmvj\"" Apr 20 19:12:26.270856 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:26.270834 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9gbcz" Apr 20 19:12:26.392950 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:26.392920 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9gbcz"] Apr 20 19:12:26.396136 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:12:26.396103 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a7c0b2_25d6_499d_9c36_ca4ace9c7813.slice/crio-f0d1a31527d3c3a2e45dda4882b532894e1160a5c8d7dac4e29e942fc956ce55 WatchSource:0}: Error finding container f0d1a31527d3c3a2e45dda4882b532894e1160a5c8d7dac4e29e942fc956ce55: Status 404 returned error can't find the container with id f0d1a31527d3c3a2e45dda4882b532894e1160a5c8d7dac4e29e942fc956ce55 Apr 20 19:12:27.155805 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:27.155767 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9gbcz" event={"ID":"45a7c0b2-25d6-499d-9c36-ca4ace9c7813","Type":"ContainerStarted","Data":"f0d1a31527d3c3a2e45dda4882b532894e1160a5c8d7dac4e29e942fc956ce55"} Apr 20 19:12:28.160184 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:28.160149 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9gbcz" event={"ID":"45a7c0b2-25d6-499d-9c36-ca4ace9c7813","Type":"ContainerStarted","Data":"6f57f3292fddd983b51159b35cf3a77634baf9346d3249e371a8eee32e0e8021"} Apr 20 19:12:28.160184 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:28.160185 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9gbcz" event={"ID":"45a7c0b2-25d6-499d-9c36-ca4ace9c7813","Type":"ContainerStarted","Data":"8604e554fca0939fc1863c3bdc3a8fe1ff909aa7812907f0b03e4c3c17af14d5"} Apr 20 19:12:28.177721 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:28.177676 2583 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-9gbcz" podStartSLOduration=253.285765122 podStartE2EDuration="4m14.177663067s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:12:26.398060506 +0000 UTC m=+252.630351007" lastFinishedPulling="2026-04-20 19:12:27.289958466 +0000 UTC m=+253.522248952" observedRunningTime="2026-04-20 19:12:28.17607863 +0000 UTC m=+254.408369150" watchObservedRunningTime="2026-04-20 19:12:28.177663067 +0000 UTC m=+254.409953575" Apr 20 19:12:32.554472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:32.554370 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:32.554472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:32.554444 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:32.559381 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:32.559353 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:33.178898 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:33.178872 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:12:33.233534 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:33.233502 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-778b785d57-zz64k"] Apr 20 19:12:58.256430 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.256364 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-778b785d57-zz64k" podUID="4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" containerName="console" containerID="cri-o://09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f" gracePeriod=15 Apr 20 19:12:58.500910 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.500886 2583 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-778b785d57-zz64k_4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb/console/0.log" Apr 20 19:12:58.501027 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.500946 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:12:58.631428 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631397 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxp22\" (UniqueName: \"kubernetes.io/projected/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-kube-api-access-gxp22\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.631613 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631458 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-oauth-config\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.631613 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631494 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-trusted-ca-bundle\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.631613 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631521 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-oauth-serving-cert\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.631613 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631546 2583 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-serving-cert\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.631613 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631579 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-service-ca\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.631871 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.631626 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-config\") pod \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\" (UID: \"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb\") " Apr 20 19:12:58.632117 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.632072 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:58.632117 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.632080 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:58.632284 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.632193 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:58.632408 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.632325 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-config" (OuterVolumeSpecName: "console-config") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:58.633833 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.633811 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:58.634288 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.634269 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-kube-api-access-gxp22" (OuterVolumeSpecName: "kube-api-access-gxp22") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "kube-api-access-gxp22". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:12:58.634288 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.634272 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" (UID: "4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:58.732743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732706 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxp22\" (UniqueName: \"kubernetes.io/projected/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-kube-api-access-gxp22\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:58.732743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732737 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-oauth-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:58.732743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732746 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-trusted-ca-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:58.732743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732755 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-oauth-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:58.733007 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732765 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:58.733007 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732774 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-service-ca\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:58.733007 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:58.732785 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb-console-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:12:59.247957 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.247929 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-778b785d57-zz64k_4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb/console/0.log" Apr 20 19:12:59.248118 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.247970 2583 generic.go:358] "Generic (PLEG): container finished" podID="4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" containerID="09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f" exitCode=2 Apr 20 19:12:59.248118 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.248039 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-778b785d57-zz64k" Apr 20 19:12:59.248257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.248039 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778b785d57-zz64k" event={"ID":"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb","Type":"ContainerDied","Data":"09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f"} Apr 20 19:12:59.248257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.248165 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-778b785d57-zz64k" event={"ID":"4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb","Type":"ContainerDied","Data":"1b90eee737cd0e6377c93753e28a8d1339b3081ff7012cd25c4e7f41ffdc066e"} Apr 20 19:12:59.248257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.248195 2583 scope.go:117] "RemoveContainer" containerID="09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f" Apr 20 19:12:59.256725 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.256707 2583 scope.go:117] "RemoveContainer" containerID="09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f" Apr 20 19:12:59.256965 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:12:59.256950 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f\": container with ID starting with 09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f not found: ID does not exist" containerID="09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f" Apr 20 19:12:59.257006 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.256974 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f"} err="failed to get container status \"09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f\": rpc error: code = 
NotFound desc = could not find container \"09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f\": container with ID starting with 09689e03849ef00419fcacb13baf10263117739ac09f7b294de3ed0826f4538f not found: ID does not exist" Apr 20 19:12:59.270339 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.270302 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-778b785d57-zz64k"] Apr 20 19:12:59.274441 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:12:59.274421 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-778b785d57-zz64k"] Apr 20 19:13:00.360727 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:00.360692 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" path="/var/lib/kubelet/pods/4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb/volumes" Apr 20 19:13:14.234150 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:14.234122 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:13:14.236690 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:14.236669 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:13:14.245399 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:14.245373 2583 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:13:43.523790 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.523759 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d7f8b87c9-9vhcs"] Apr 20 19:13:43.525929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.524032 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" containerName="console" Apr 20 19:13:43.525929 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.524044 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" containerName="console" Apr 20 19:13:43.525929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.524098 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a1758b3-5e5c-4cc5-b7a4-c4ac7eb5a6eb" containerName="console" Apr 20 19:13:43.526817 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.526790 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.537565 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.537538 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d7f8b87c9-9vhcs"] Apr 20 19:13:43.685914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.685858 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-config\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.685914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.685909 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-service-ca\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.685914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.685927 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-trusted-ca-bundle\") pod \"console-d7f8b87c9-9vhcs\" (UID: 
\"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.686154 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.685943 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-oauth-serving-cert\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.686154 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.686058 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-serving-cert\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.686154 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.686095 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzkrm\" (UniqueName: \"kubernetes.io/projected/d17a345d-1269-44bf-93d8-04f0d3d0ded5-kube-api-access-nzkrm\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.686154 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.686124 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-oauth-config\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787460 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787364 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-serving-cert\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787460 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787434 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzkrm\" (UniqueName: \"kubernetes.io/projected/d17a345d-1269-44bf-93d8-04f0d3d0ded5-kube-api-access-nzkrm\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787460 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787466 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-oauth-config\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787697 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787508 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-config\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787697 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787536 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-service-ca\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787697 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787561 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-trusted-ca-bundle\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.787697 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.787583 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-oauth-serving-cert\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.788330 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.788284 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-oauth-serving-cert\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.788522 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.788430 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-config\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.788584 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.788550 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-service-ca\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.788641 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:13:43.788590 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-trusted-ca-bundle\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.790177 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.790153 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-oauth-config\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.790264 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.790186 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-serving-cert\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.797214 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.797188 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzkrm\" (UniqueName: \"kubernetes.io/projected/d17a345d-1269-44bf-93d8-04f0d3d0ded5-kube-api-access-nzkrm\") pod \"console-d7f8b87c9-9vhcs\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") " pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.838156 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.838123 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:43.970972 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.970944 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d7f8b87c9-9vhcs"] Apr 20 19:13:43.973649 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:13:43.973608 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd17a345d_1269_44bf_93d8_04f0d3d0ded5.slice/crio-885ee1e151a06ee6fd67db9c104a060aca3e174811d7b73ed8cb195ac6fd894a WatchSource:0}: Error finding container 885ee1e151a06ee6fd67db9c104a060aca3e174811d7b73ed8cb195ac6fd894a: Status 404 returned error can't find the container with id 885ee1e151a06ee6fd67db9c104a060aca3e174811d7b73ed8cb195ac6fd894a Apr 20 19:13:43.975706 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:43.975691 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:13:44.371513 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:44.371482 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d7f8b87c9-9vhcs" event={"ID":"d17a345d-1269-44bf-93d8-04f0d3d0ded5","Type":"ContainerStarted","Data":"e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a"} Apr 20 19:13:44.371513 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:44.371515 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d7f8b87c9-9vhcs" event={"ID":"d17a345d-1269-44bf-93d8-04f0d3d0ded5","Type":"ContainerStarted","Data":"885ee1e151a06ee6fd67db9c104a060aca3e174811d7b73ed8cb195ac6fd894a"} Apr 20 19:13:44.395938 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:44.395888 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d7f8b87c9-9vhcs" podStartSLOduration=1.395873905 podStartE2EDuration="1.395873905s" podCreationTimestamp="2026-04-20 19:13:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:13:44.393675378 +0000 UTC m=+330.625965887" watchObservedRunningTime="2026-04-20 19:13:44.395873905 +0000 UTC m=+330.628164412" Apr 20 19:13:53.839024 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:53.838977 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:53.839514 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:53.839069 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:53.843715 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:53.843695 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:54.405478 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:54.405451 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:13:54.450257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:13:54.450228 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58b547c588-rtw7b"] Apr 20 19:14:19.473697 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.473598 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58b547c588-rtw7b" podUID="96f59f1e-15fc-403e-b192-172a795b295e" containerName="console" containerID="cri-o://4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8" gracePeriod=15 Apr 20 19:14:19.710645 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.710622 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58b547c588-rtw7b_96f59f1e-15fc-403e-b192-172a795b295e/console/0.log" Apr 20 19:14:19.710757 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.710683 2583 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:14:19.748520 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748445 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-console-config\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748520 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748488 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-trusted-ca-bundle\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748520 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748506 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-oauth-serving-cert\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748758 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748550 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6r9w\" (UniqueName: \"kubernetes.io/projected/96f59f1e-15fc-403e-b192-172a795b295e-kube-api-access-g6r9w\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748758 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748578 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-serving-cert\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: 
\"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748758 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748624 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-oauth-config\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748758 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748652 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-service-ca\") pod \"96f59f1e-15fc-403e-b192-172a795b295e\" (UID: \"96f59f1e-15fc-403e-b192-172a795b295e\") " Apr 20 19:14:19.748957 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748848 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-console-config" (OuterVolumeSpecName: "console-config") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:14:19.748957 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748898 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:14:19.749075 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.748954 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:14:19.749130 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.749099 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-service-ca" (OuterVolumeSpecName: "service-ca") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:14:19.750881 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.750854 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:14:19.750988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.750909 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:14:19.750988 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.750968 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f59f1e-15fc-403e-b192-172a795b295e-kube-api-access-g6r9w" (OuterVolumeSpecName: "kube-api-access-g6r9w") pod "96f59f1e-15fc-403e-b192-172a795b295e" (UID: "96f59f1e-15fc-403e-b192-172a795b295e"). InnerVolumeSpecName "kube-api-access-g6r9w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:14:19.849552 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849518 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6r9w\" (UniqueName: \"kubernetes.io/projected/96f59f1e-15fc-403e-b192-172a795b295e-kube-api-access-g6r9w\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:19.849552 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849549 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:19.849552 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849558 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96f59f1e-15fc-403e-b192-172a795b295e-console-oauth-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:19.849769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849568 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-service-ca\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:19.849769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849577 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-console-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:19.849769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849585 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-trusted-ca-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:19.849769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:19.849593 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96f59f1e-15fc-403e-b192-172a795b295e-oauth-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:14:20.471953 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.471927 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58b547c588-rtw7b_96f59f1e-15fc-403e-b192-172a795b295e/console/0.log" Apr 20 19:14:20.472133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.471968 2583 generic.go:358] "Generic (PLEG): container finished" podID="96f59f1e-15fc-403e-b192-172a795b295e" containerID="4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8" exitCode=2 Apr 20 19:14:20.472133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.472033 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58b547c588-rtw7b" Apr 20 19:14:20.472133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.472048 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58b547c588-rtw7b" event={"ID":"96f59f1e-15fc-403e-b192-172a795b295e","Type":"ContainerDied","Data":"4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8"} Apr 20 19:14:20.472133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.472080 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58b547c588-rtw7b" event={"ID":"96f59f1e-15fc-403e-b192-172a795b295e","Type":"ContainerDied","Data":"391a60142990de21409bd7b49c0d5b6dbd5e18fd1d697c9c734a68fa9f3d86e6"} Apr 20 19:14:20.472133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.472096 2583 scope.go:117] "RemoveContainer" containerID="4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8" Apr 20 19:14:20.480166 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.479969 2583 scope.go:117] "RemoveContainer" containerID="4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8" Apr 20 19:14:20.480450 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:14:20.480256 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8\": container with ID starting with 4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8 not found: ID does not exist" containerID="4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8" Apr 20 19:14:20.480450 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.480280 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8"} err="failed to get container status \"4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8\": rpc error: code = 
NotFound desc = could not find container \"4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8\": container with ID starting with 4b1607c11d70e4fb695c274e6836c25968ac8f63ed477ac811668aa7a841d8a8 not found: ID does not exist" Apr 20 19:14:20.490841 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.490819 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58b547c588-rtw7b"] Apr 20 19:14:20.494896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:20.494877 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58b547c588-rtw7b"] Apr 20 19:14:22.344358 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.344321 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rrw7f"] Apr 20 19:14:22.344798 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.344601 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96f59f1e-15fc-403e-b192-172a795b295e" containerName="console" Apr 20 19:14:22.344798 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.344611 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f59f1e-15fc-403e-b192-172a795b295e" containerName="console" Apr 20 19:14:22.344798 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.344658 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="96f59f1e-15fc-403e-b192-172a795b295e" containerName="console" Apr 20 19:14:22.348696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.348676 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.351992 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.351964 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:14:22.361657 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.361634 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f59f1e-15fc-403e-b192-172a795b295e" path="/var/lib/kubelet/pods/96f59f1e-15fc-403e-b192-172a795b295e/volumes" Apr 20 19:14:22.361880 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.361867 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rrw7f"] Apr 20 19:14:22.470420 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.470378 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d447757f-829f-4f8a-8032-c70450b9f31f-kubelet-config\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.470612 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.470495 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d447757f-829f-4f8a-8032-c70450b9f31f-dbus\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.470612 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.470525 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d447757f-829f-4f8a-8032-c70450b9f31f-original-pull-secret\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " 
pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.571108 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.571068 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d447757f-829f-4f8a-8032-c70450b9f31f-dbus\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.571108 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.571106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d447757f-829f-4f8a-8032-c70450b9f31f-original-pull-secret\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.571337 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.571133 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d447757f-829f-4f8a-8032-c70450b9f31f-kubelet-config\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.571337 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.571209 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d447757f-829f-4f8a-8032-c70450b9f31f-kubelet-config\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.571337 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.571284 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d447757f-829f-4f8a-8032-c70450b9f31f-dbus\") pod \"global-pull-secret-syncer-rrw7f\" (UID: 
\"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.573562 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.573534 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d447757f-829f-4f8a-8032-c70450b9f31f-original-pull-secret\") pod \"global-pull-secret-syncer-rrw7f\" (UID: \"d447757f-829f-4f8a-8032-c70450b9f31f\") " pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.657939 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.657858 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rrw7f" Apr 20 19:14:22.780117 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:22.780090 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rrw7f"] Apr 20 19:14:22.782476 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:14:22.782450 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd447757f_829f_4f8a_8032_c70450b9f31f.slice/crio-c741c39cb30752b74b9d85a53d577f884dbf71c5199f6e40726e4b61d3ab0b41 WatchSource:0}: Error finding container c741c39cb30752b74b9d85a53d577f884dbf71c5199f6e40726e4b61d3ab0b41: Status 404 returned error can't find the container with id c741c39cb30752b74b9d85a53d577f884dbf71c5199f6e40726e4b61d3ab0b41 Apr 20 19:14:23.481973 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:23.481938 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rrw7f" event={"ID":"d447757f-829f-4f8a-8032-c70450b9f31f","Type":"ContainerStarted","Data":"c741c39cb30752b74b9d85a53d577f884dbf71c5199f6e40726e4b61d3ab0b41"} Apr 20 19:14:28.497365 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:28.497333 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rrw7f" 
event={"ID":"d447757f-829f-4f8a-8032-c70450b9f31f","Type":"ContainerStarted","Data":"b4a7cea454fedf226f8967a3b4b071930dc65aff66bf3424b7f36ed01f87d67d"} Apr 20 19:14:28.514029 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:14:28.513984 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rrw7f" podStartSLOduration=1.565827493 podStartE2EDuration="6.513968607s" podCreationTimestamp="2026-04-20 19:14:22 +0000 UTC" firstStartedPulling="2026-04-20 19:14:22.78402544 +0000 UTC m=+369.016315926" lastFinishedPulling="2026-04-20 19:14:27.73216655 +0000 UTC m=+373.964457040" observedRunningTime="2026-04-20 19:14:28.513027286 +0000 UTC m=+374.745317793" watchObservedRunningTime="2026-04-20 19:14:28.513968607 +0000 UTC m=+374.746259115" Apr 20 19:15:00.781258 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.781219 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9"] Apr 20 19:15:00.785358 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.785341 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.788406 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.788378 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:15:00.788537 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.788379 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qnrr\"" Apr 20 19:15:00.789485 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.789470 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:15:00.793993 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.793970 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9"] Apr 20 19:15:00.873725 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.873694 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxqgq\" (UniqueName: \"kubernetes.io/projected/eb6a209f-3252-45d4-b55c-79e0d7859cb1-kube-api-access-lxqgq\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.873934 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.873739 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.873934 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.873811 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.974890 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.974858 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.975013 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.974928 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.975013 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.974969 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxqgq\" (UniqueName: \"kubernetes.io/projected/eb6a209f-3252-45d4-b55c-79e0d7859cb1-kube-api-access-lxqgq\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.975293 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.975273 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.975350 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.975290 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:00.984662 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:00.984631 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxqgq\" (UniqueName: \"kubernetes.io/projected/eb6a209f-3252-45d4-b55c-79e0d7859cb1-kube-api-access-lxqgq\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:01.095720 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:01.095688 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:01.221280 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:01.221246 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9"] Apr 20 19:15:01.225004 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:01.224979 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6a209f_3252_45d4_b55c_79e0d7859cb1.slice/crio-91321a594aadb27f0a924edba071f4d3e948a1db5a5d81d1bc7d1b01725ed108 WatchSource:0}: Error finding container 91321a594aadb27f0a924edba071f4d3e948a1db5a5d81d1bc7d1b01725ed108: Status 404 returned error can't find the container with id 91321a594aadb27f0a924edba071f4d3e948a1db5a5d81d1bc7d1b01725ed108 Apr 20 19:15:01.596078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:01.596039 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" event={"ID":"eb6a209f-3252-45d4-b55c-79e0d7859cb1","Type":"ContainerStarted","Data":"91321a594aadb27f0a924edba071f4d3e948a1db5a5d81d1bc7d1b01725ed108"} Apr 20 19:15:07.615486 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:07.615446 2583 generic.go:358] "Generic (PLEG): container finished" podID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerID="735392d15970cf63fe7eca21e14064f062596272c6bbdb0963d2615bdf780339" exitCode=0 Apr 20 19:15:07.615869 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:07.615518 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" event={"ID":"eb6a209f-3252-45d4-b55c-79e0d7859cb1","Type":"ContainerDied","Data":"735392d15970cf63fe7eca21e14064f062596272c6bbdb0963d2615bdf780339"} Apr 20 19:15:09.622269 ip-10-0-139-126 kubenswrapper[2583]: 
I0420 19:15:09.622239 2583 generic.go:358] "Generic (PLEG): container finished" podID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerID="bfca049de59107245f063e96a4427bfca1d0b26605569f9e5283b9cd8cdbef1d" exitCode=0 Apr 20 19:15:09.622668 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:09.622335 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" event={"ID":"eb6a209f-3252-45d4-b55c-79e0d7859cb1","Type":"ContainerDied","Data":"bfca049de59107245f063e96a4427bfca1d0b26605569f9e5283b9cd8cdbef1d"} Apr 20 19:15:16.647511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:16.647478 2583 generic.go:358] "Generic (PLEG): container finished" podID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerID="36cc0a638413619cb74e1834b702b015c6c899b1dffc7614db6f3e1b3ec303fe" exitCode=0 Apr 20 19:15:16.647878 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:16.647540 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" event={"ID":"eb6a209f-3252-45d4-b55c-79e0d7859cb1","Type":"ContainerDied","Data":"36cc0a638413619cb74e1834b702b015c6c899b1dffc7614db6f3e1b3ec303fe"} Apr 20 19:15:17.773097 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.773074 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:17.812915 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.812887 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxqgq\" (UniqueName: \"kubernetes.io/projected/eb6a209f-3252-45d4-b55c-79e0d7859cb1-kube-api-access-lxqgq\") pod \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " Apr 20 19:15:17.813063 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.812940 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-util\") pod \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " Apr 20 19:15:17.813063 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.813054 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-bundle\") pod \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\" (UID: \"eb6a209f-3252-45d4-b55c-79e0d7859cb1\") " Apr 20 19:15:17.813699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.813673 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-bundle" (OuterVolumeSpecName: "bundle") pod "eb6a209f-3252-45d4-b55c-79e0d7859cb1" (UID: "eb6a209f-3252-45d4-b55c-79e0d7859cb1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:15:17.815249 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.815229 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6a209f-3252-45d4-b55c-79e0d7859cb1-kube-api-access-lxqgq" (OuterVolumeSpecName: "kube-api-access-lxqgq") pod "eb6a209f-3252-45d4-b55c-79e0d7859cb1" (UID: "eb6a209f-3252-45d4-b55c-79e0d7859cb1"). InnerVolumeSpecName "kube-api-access-lxqgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:15:17.819414 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.819388 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-util" (OuterVolumeSpecName: "util") pod "eb6a209f-3252-45d4-b55c-79e0d7859cb1" (UID: "eb6a209f-3252-45d4-b55c-79e0d7859cb1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:15:17.913778 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.913698 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:17.913778 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.913725 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxqgq\" (UniqueName: \"kubernetes.io/projected/eb6a209f-3252-45d4-b55c-79e0d7859cb1-kube-api-access-lxqgq\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:17.913778 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:17.913736 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eb6a209f-3252-45d4-b55c-79e0d7859cb1-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:18.654929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:18.654837 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" event={"ID":"eb6a209f-3252-45d4-b55c-79e0d7859cb1","Type":"ContainerDied","Data":"91321a594aadb27f0a924edba071f4d3e948a1db5a5d81d1bc7d1b01725ed108"} Apr 20 19:15:18.654929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:18.654871 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91321a594aadb27f0a924edba071f4d3e948a1db5a5d81d1bc7d1b01725ed108" Apr 20 19:15:18.654929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:18.654880 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dtjgq9" Apr 20 19:15:23.756206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756166 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q"] Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756484 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="util" Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756497 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="util" Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756511 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="pull" Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756517 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="pull" Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756526 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="extract" Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756532 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="extract" Apr 20 19:15:23.756609 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.756589 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb6a209f-3252-45d4-b55c-79e0d7859cb1" containerName="extract" Apr 20 19:15:23.805691 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.805654 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q"] Apr 20 19:15:23.805829 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.805773 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:23.808873 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.808841 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:15:23.808873 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.808856 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-qg6q6\"" Apr 20 19:15:23.809037 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.808879 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 19:15:23.858887 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.858846 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqb84\" (UniqueName: \"kubernetes.io/projected/49fe9c36-eeea-4e89-895c-df6ff01bcf7e-kube-api-access-fqb84\") pod 
\"cert-manager-operator-controller-manager-54b9655956-jcp5q\" (UID: \"49fe9c36-eeea-4e89-895c-df6ff01bcf7e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:23.859074 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.858894 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49fe9c36-eeea-4e89-895c-df6ff01bcf7e-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-jcp5q\" (UID: \"49fe9c36-eeea-4e89-895c-df6ff01bcf7e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:23.959544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.959509 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqb84\" (UniqueName: \"kubernetes.io/projected/49fe9c36-eeea-4e89-895c-df6ff01bcf7e-kube-api-access-fqb84\") pod \"cert-manager-operator-controller-manager-54b9655956-jcp5q\" (UID: \"49fe9c36-eeea-4e89-895c-df6ff01bcf7e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:23.959544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.959552 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49fe9c36-eeea-4e89-895c-df6ff01bcf7e-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-jcp5q\" (UID: \"49fe9c36-eeea-4e89-895c-df6ff01bcf7e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:23.959919 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.959903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49fe9c36-eeea-4e89-895c-df6ff01bcf7e-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-jcp5q\" (UID: \"49fe9c36-eeea-4e89-895c-df6ff01bcf7e\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:23.968498 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:23.968461 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqb84\" (UniqueName: \"kubernetes.io/projected/49fe9c36-eeea-4e89-895c-df6ff01bcf7e-kube-api-access-fqb84\") pod \"cert-manager-operator-controller-manager-54b9655956-jcp5q\" (UID: \"49fe9c36-eeea-4e89-895c-df6ff01bcf7e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:24.114885 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:24.114848 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" Apr 20 19:15:24.248467 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:24.248439 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q"] Apr 20 19:15:24.251372 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:24.251341 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49fe9c36_eeea_4e89_895c_df6ff01bcf7e.slice/crio-996b791113c3dd63986cd684eeaf959ffde6a791e7e0c5e61ea2173e077d6b35 WatchSource:0}: Error finding container 996b791113c3dd63986cd684eeaf959ffde6a791e7e0c5e61ea2173e077d6b35: Status 404 returned error can't find the container with id 996b791113c3dd63986cd684eeaf959ffde6a791e7e0c5e61ea2173e077d6b35 Apr 20 19:15:24.672372 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:24.672334 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" event={"ID":"49fe9c36-eeea-4e89-895c-df6ff01bcf7e","Type":"ContainerStarted","Data":"996b791113c3dd63986cd684eeaf959ffde6a791e7e0c5e61ea2173e077d6b35"} Apr 20 19:15:26.679982 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:26.679939 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" event={"ID":"49fe9c36-eeea-4e89-895c-df6ff01bcf7e","Type":"ContainerStarted","Data":"491bc7986526e4f9ea082b0ca4bfe627e9b53a6b521364ad3a21c00c82013544"} Apr 20 19:15:26.702199 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:26.702139 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-jcp5q" podStartSLOduration=1.701716678 podStartE2EDuration="3.702121109s" podCreationTimestamp="2026-04-20 19:15:23 +0000 UTC" firstStartedPulling="2026-04-20 19:15:24.2546424 +0000 UTC m=+430.486932889" lastFinishedPulling="2026-04-20 19:15:26.25504683 +0000 UTC m=+432.487337320" observedRunningTime="2026-04-20 19:15:26.699488055 +0000 UTC m=+432.931778563" watchObservedRunningTime="2026-04-20 19:15:26.702121109 +0000 UTC m=+432.934411618" Apr 20 19:15:27.740299 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.740263 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq"] Apr 20 19:15:27.764787 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.764744 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq"] Apr 20 19:15:27.764940 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.764911 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.767962 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.767939 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qnrr\"" Apr 20 19:15:27.768082 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.768051 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:15:27.769151 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.769136 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:15:27.890985 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.890950 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.891172 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.890995 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqt58\" (UniqueName: \"kubernetes.io/projected/482b273d-a4dd-45ee-b8d8-69322b3c1497-kube-api-access-qqt58\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.891172 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.891021 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.992080 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.991996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.992080 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.992044 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqt58\" (UniqueName: \"kubernetes.io/projected/482b273d-a4dd-45ee-b8d8-69322b3c1497-kube-api-access-qqt58\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.992080 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.992077 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.992461 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.992443 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-util\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:27.992510 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:27.992485 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:28.001216 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:28.001194 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqt58\" (UniqueName: \"kubernetes.io/projected/482b273d-a4dd-45ee-b8d8-69322b3c1497-kube-api-access-qqt58\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:28.074234 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:28.074200 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:28.220152 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:28.220118 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq"] Apr 20 19:15:28.222138 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:28.222111 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482b273d_a4dd_45ee_b8d8_69322b3c1497.slice/crio-674284d7ed0e0fb04c3c8055ab015843c29e559b9194da1ea38e74fbd3ed6879 WatchSource:0}: Error finding container 674284d7ed0e0fb04c3c8055ab015843c29e559b9194da1ea38e74fbd3ed6879: Status 404 returned error can't find the container with id 674284d7ed0e0fb04c3c8055ab015843c29e559b9194da1ea38e74fbd3ed6879 Apr 20 19:15:28.687485 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:28.687457 2583 generic.go:358] "Generic (PLEG): container finished" podID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerID="b2dda72b63216cf60115ddac0a95e958916ab16a1b62552b95f5196ed43073d8" exitCode=0 Apr 20 19:15:28.687654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:28.687541 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" event={"ID":"482b273d-a4dd-45ee-b8d8-69322b3c1497","Type":"ContainerDied","Data":"b2dda72b63216cf60115ddac0a95e958916ab16a1b62552b95f5196ed43073d8"} Apr 20 19:15:28.687654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:28.687585 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" event={"ID":"482b273d-a4dd-45ee-b8d8-69322b3c1497","Type":"ContainerStarted","Data":"674284d7ed0e0fb04c3c8055ab015843c29e559b9194da1ea38e74fbd3ed6879"} Apr 20 19:15:30.695752 ip-10-0-139-126 kubenswrapper[2583]: 
I0420 19:15:30.695670 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" event={"ID":"482b273d-a4dd-45ee-b8d8-69322b3c1497","Type":"ContainerStarted","Data":"a6ed050b7706fd3504048855948abdff44b5bb359828e3253dccd50728445406"} Apr 20 19:15:31.700264 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:31.700226 2583 generic.go:358] "Generic (PLEG): container finished" podID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerID="a6ed050b7706fd3504048855948abdff44b5bb359828e3253dccd50728445406" exitCode=0 Apr 20 19:15:31.700660 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:31.700269 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" event={"ID":"482b273d-a4dd-45ee-b8d8-69322b3c1497","Type":"ContainerDied","Data":"a6ed050b7706fd3504048855948abdff44b5bb359828e3253dccd50728445406"} Apr 20 19:15:32.704621 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:32.704589 2583 generic.go:358] "Generic (PLEG): container finished" podID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerID="859f70bd059e505c2e0b21561b2721e731511056c59c3cb365aa4567f1b57931" exitCode=0 Apr 20 19:15:32.705008 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:32.704649 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" event={"ID":"482b273d-a4dd-45ee-b8d8-69322b3c1497","Type":"ContainerDied","Data":"859f70bd059e505c2e0b21561b2721e731511056c59c3cb365aa4567f1b57931"} Apr 20 19:15:33.825233 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.825205 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:33.943268 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.943229 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-util\") pod \"482b273d-a4dd-45ee-b8d8-69322b3c1497\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " Apr 20 19:15:33.943268 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.943269 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-bundle\") pod \"482b273d-a4dd-45ee-b8d8-69322b3c1497\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " Apr 20 19:15:33.943529 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.943374 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqt58\" (UniqueName: \"kubernetes.io/projected/482b273d-a4dd-45ee-b8d8-69322b3c1497-kube-api-access-qqt58\") pod \"482b273d-a4dd-45ee-b8d8-69322b3c1497\" (UID: \"482b273d-a4dd-45ee-b8d8-69322b3c1497\") " Apr 20 19:15:33.943649 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.943626 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-bundle" (OuterVolumeSpecName: "bundle") pod "482b273d-a4dd-45ee-b8d8-69322b3c1497" (UID: "482b273d-a4dd-45ee-b8d8-69322b3c1497"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:15:33.945665 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.945640 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482b273d-a4dd-45ee-b8d8-69322b3c1497-kube-api-access-qqt58" (OuterVolumeSpecName: "kube-api-access-qqt58") pod "482b273d-a4dd-45ee-b8d8-69322b3c1497" (UID: "482b273d-a4dd-45ee-b8d8-69322b3c1497"). InnerVolumeSpecName "kube-api-access-qqt58". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:15:33.947649 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:33.947628 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-util" (OuterVolumeSpecName: "util") pod "482b273d-a4dd-45ee-b8d8-69322b3c1497" (UID: "482b273d-a4dd-45ee-b8d8-69322b3c1497"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:15:34.044254 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.044160 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqt58\" (UniqueName: \"kubernetes.io/projected/482b273d-a4dd-45ee-b8d8-69322b3c1497-kube-api-access-qqt58\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:34.044254 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.044206 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:34.044254 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.044218 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/482b273d-a4dd-45ee-b8d8-69322b3c1497-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:34.712356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.712244 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" event={"ID":"482b273d-a4dd-45ee-b8d8-69322b3c1497","Type":"ContainerDied","Data":"674284d7ed0e0fb04c3c8055ab015843c29e559b9194da1ea38e74fbd3ed6879"} Apr 20 19:15:34.712356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.712280 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="674284d7ed0e0fb04c3c8055ab015843c29e559b9194da1ea38e74fbd3ed6879" Apr 20 19:15:34.712356 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.712296 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffvvqq" Apr 20 19:15:34.929900 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.929866 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-wwzj4"] Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930138 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="pull" Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930149 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="pull" Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930161 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="extract" Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930167 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="extract" Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930185 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="util" Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930190 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="util" Apr 20 19:15:34.930253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.930235 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="482b273d-a4dd-45ee-b8d8-69322b3c1497" containerName="extract" Apr 20 19:15:34.932163 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.932141 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:34.934868 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.934834 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 19:15:34.935023 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.934872 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-p85gm\"" Apr 20 19:15:34.935023 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.934851 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 19:15:34.941092 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:34.941072 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-wwzj4"] Apr 20 19:15:35.054473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.054375 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-wwzj4\" (UID: \"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.054634 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.054479 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqzlv\" (UniqueName: \"kubernetes.io/projected/fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030-kube-api-access-zqzlv\") pod \"cert-manager-webhook-587ccfb98-wwzj4\" (UID: \"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.155083 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.155048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqzlv\" (UniqueName: \"kubernetes.io/projected/fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030-kube-api-access-zqzlv\") pod \"cert-manager-webhook-587ccfb98-wwzj4\" (UID: \"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.155262 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.155146 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-wwzj4\" (UID: \"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.166392 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.166360 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-wwzj4\" (UID: \"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.166618 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.166599 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqzlv\" (UniqueName: 
\"kubernetes.io/projected/fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030-kube-api-access-zqzlv\") pod \"cert-manager-webhook-587ccfb98-wwzj4\" (UID: \"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030\") " pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.241445 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.241417 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:35.360849 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.360803 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-wwzj4"] Apr 20 19:15:35.362814 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:35.362787 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa7f1dc8_e8e5_450f_bdb1_d8c53e6c5030.slice/crio-9f5f0378b0efcb3bdb1a37022263ef39aef29ff133f82d5101fba2601e4cf2d4 WatchSource:0}: Error finding container 9f5f0378b0efcb3bdb1a37022263ef39aef29ff133f82d5101fba2601e4cf2d4: Status 404 returned error can't find the container with id 9f5f0378b0efcb3bdb1a37022263ef39aef29ff133f82d5101fba2601e4cf2d4 Apr 20 19:15:35.716040 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:35.716007 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" event={"ID":"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030","Type":"ContainerStarted","Data":"9f5f0378b0efcb3bdb1a37022263ef39aef29ff133f82d5101fba2601e4cf2d4"} Apr 20 19:15:37.724417 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:37.724382 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" event={"ID":"fa7f1dc8-e8e5-450f-bdb1-d8c53e6c5030","Type":"ContainerStarted","Data":"c2685bb5f5a511ebae5a552d65b3099f982a5a286703a55f32031d461d4e0a60"} Apr 20 19:15:37.724772 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:37.724465 2583 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:37.742163 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:37.742120 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" podStartSLOduration=1.594868744 podStartE2EDuration="3.742109467s" podCreationTimestamp="2026-04-20 19:15:34 +0000 UTC" firstStartedPulling="2026-04-20 19:15:35.364794992 +0000 UTC m=+441.597085478" lastFinishedPulling="2026-04-20 19:15:37.512035712 +0000 UTC m=+443.744326201" observedRunningTime="2026-04-20 19:15:37.740285066 +0000 UTC m=+443.972575571" watchObservedRunningTime="2026-04-20 19:15:37.742109467 +0000 UTC m=+443.974399974" Apr 20 19:15:40.607432 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.607395 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw"] Apr 20 19:15:40.609501 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.609485 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.612225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.612197 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:15:40.613513 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.613495 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:15:40.613610 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.613585 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-4dm8l\"" Apr 20 19:15:40.619080 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.619059 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw"] Apr 20 19:15:40.703601 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.703565 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txnjk\" (UniqueName: \"kubernetes.io/projected/1ec94478-e96e-4ecf-b153-39178810d00b-kube-api-access-txnjk\") pod \"openshift-lws-operator-bfc7f696d-ztkbw\" (UID: \"1ec94478-e96e-4ecf-b153-39178810d00b\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.703767 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.703615 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ec94478-e96e-4ecf-b153-39178810d00b-tmp\") pod \"openshift-lws-operator-bfc7f696d-ztkbw\" (UID: \"1ec94478-e96e-4ecf-b153-39178810d00b\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.804494 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.804456 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-txnjk\" (UniqueName: \"kubernetes.io/projected/1ec94478-e96e-4ecf-b153-39178810d00b-kube-api-access-txnjk\") pod \"openshift-lws-operator-bfc7f696d-ztkbw\" (UID: \"1ec94478-e96e-4ecf-b153-39178810d00b\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.804644 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.804506 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ec94478-e96e-4ecf-b153-39178810d00b-tmp\") pod \"openshift-lws-operator-bfc7f696d-ztkbw\" (UID: \"1ec94478-e96e-4ecf-b153-39178810d00b\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.804830 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.804815 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ec94478-e96e-4ecf-b153-39178810d00b-tmp\") pod \"openshift-lws-operator-bfc7f696d-ztkbw\" (UID: \"1ec94478-e96e-4ecf-b153-39178810d00b\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.813241 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.813209 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txnjk\" (UniqueName: \"kubernetes.io/projected/1ec94478-e96e-4ecf-b153-39178810d00b-kube-api-access-txnjk\") pod \"openshift-lws-operator-bfc7f696d-ztkbw\" (UID: \"1ec94478-e96e-4ecf-b153-39178810d00b\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:40.918481 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:40.918381 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" Apr 20 19:15:41.041163 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:41.041138 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw"] Apr 20 19:15:41.043583 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:41.043541 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ec94478_e96e_4ecf_b153_39178810d00b.slice/crio-ef19093f77e568b3e7f2292dff1c1087c38e2cd2ff11a2c6d5d35ab8f1539d1f WatchSource:0}: Error finding container ef19093f77e568b3e7f2292dff1c1087c38e2cd2ff11a2c6d5d35ab8f1539d1f: Status 404 returned error can't find the container with id ef19093f77e568b3e7f2292dff1c1087c38e2cd2ff11a2c6d5d35ab8f1539d1f Apr 20 19:15:41.738240 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:41.738198 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" event={"ID":"1ec94478-e96e-4ecf-b153-39178810d00b","Type":"ContainerStarted","Data":"ef19093f77e568b3e7f2292dff1c1087c38e2cd2ff11a2c6d5d35ab8f1539d1f"} Apr 20 19:15:43.730664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:43.730632 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-wwzj4" Apr 20 19:15:43.745883 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:43.745848 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" event={"ID":"1ec94478-e96e-4ecf-b153-39178810d00b","Type":"ContainerStarted","Data":"85271d0094545d3e95cd7785883871d23043e62cf869cb172aa6697091a070b9"} Apr 20 19:15:43.764682 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:43.764631 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-ztkbw" podStartSLOduration=2.087735473 podStartE2EDuration="3.764613605s" podCreationTimestamp="2026-04-20 19:15:40 +0000 UTC" firstStartedPulling="2026-04-20 19:15:41.045096843 +0000 UTC m=+447.277387331" lastFinishedPulling="2026-04-20 19:15:42.721974977 +0000 UTC m=+448.954265463" observedRunningTime="2026-04-20 19:15:43.763994269 +0000 UTC m=+449.996284777" watchObservedRunningTime="2026-04-20 19:15:43.764613605 +0000 UTC m=+449.996904113" Apr 20 19:15:48.394394 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.394364 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9"] Apr 20 19:15:48.396606 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.396589 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.399051 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.399030 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:15:48.400192 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.400173 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qnrr\"" Apr 20 19:15:48.400341 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.400212 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:15:48.406817 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.406791 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9"] Apr 20 19:15:48.571181 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.571148 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.571391 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.571196 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrcl\" (UniqueName: \"kubernetes.io/projected/480853bc-a688-49b5-9912-6cfaa998c5fe-kube-api-access-mqrcl\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.571391 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.571263 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.672752 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.672665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.672752 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.672726 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.672953 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.672757 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrcl\" (UniqueName: \"kubernetes.io/projected/480853bc-a688-49b5-9912-6cfaa998c5fe-kube-api-access-mqrcl\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.673135 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.673109 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.673208 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.673123 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.681424 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.681398 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrcl\" (UniqueName: 
\"kubernetes.io/projected/480853bc-a688-49b5-9912-6cfaa998c5fe-kube-api-access-mqrcl\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.706552 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.706526 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:48.830499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:48.830464 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9"] Apr 20 19:15:48.832663 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:48.832630 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480853bc_a688_49b5_9912_6cfaa998c5fe.slice/crio-2e8639f402618b90c588ee3bb50d5382646b33d166f2754bc6a8f6900d903d47 WatchSource:0}: Error finding container 2e8639f402618b90c588ee3bb50d5382646b33d166f2754bc6a8f6900d903d47: Status 404 returned error can't find the container with id 2e8639f402618b90c588ee3bb50d5382646b33d166f2754bc6a8f6900d903d47 Apr 20 19:15:49.769641 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:49.769609 2583 generic.go:358] "Generic (PLEG): container finished" podID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerID="80700b7f0642ee6fa38064ae391b3e40caeff20901ab4927085aca9650fb115b" exitCode=0 Apr 20 19:15:49.770007 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:49.769696 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" event={"ID":"480853bc-a688-49b5-9912-6cfaa998c5fe","Type":"ContainerDied","Data":"80700b7f0642ee6fa38064ae391b3e40caeff20901ab4927085aca9650fb115b"} Apr 20 
19:15:49.770007 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:49.769726 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" event={"ID":"480853bc-a688-49b5-9912-6cfaa998c5fe","Type":"ContainerStarted","Data":"2e8639f402618b90c588ee3bb50d5382646b33d166f2754bc6a8f6900d903d47"} Apr 20 19:15:50.774614 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:50.774515 2583 generic.go:358] "Generic (PLEG): container finished" podID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerID="38a4c66a5d9f65dbff5ee0fbc136655e64c241ebbe451b1675888a7f0a63879f" exitCode=0 Apr 20 19:15:50.775058 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:50.774607 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" event={"ID":"480853bc-a688-49b5-9912-6cfaa998c5fe","Type":"ContainerDied","Data":"38a4c66a5d9f65dbff5ee0fbc136655e64c241ebbe451b1675888a7f0a63879f"} Apr 20 19:15:51.779580 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:51.779543 2583 generic.go:358] "Generic (PLEG): container finished" podID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerID="7dc63641ebd13d3461565d1dc2cf666ee79fa3357383ea8406be925f58ee1f76" exitCode=0 Apr 20 19:15:51.779969 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:51.779594 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" event={"ID":"480853bc-a688-49b5-9912-6cfaa998c5fe","Type":"ContainerDied","Data":"7dc63641ebd13d3461565d1dc2cf666ee79fa3357383ea8406be925f58ee1f76"} Apr 20 19:15:52.905350 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:52.905298 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:53.010633 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.010599 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-util\") pod \"480853bc-a688-49b5-9912-6cfaa998c5fe\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " Apr 20 19:15:53.010807 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.010643 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-bundle\") pod \"480853bc-a688-49b5-9912-6cfaa998c5fe\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " Apr 20 19:15:53.010807 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.010691 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrcl\" (UniqueName: \"kubernetes.io/projected/480853bc-a688-49b5-9912-6cfaa998c5fe-kube-api-access-mqrcl\") pod \"480853bc-a688-49b5-9912-6cfaa998c5fe\" (UID: \"480853bc-a688-49b5-9912-6cfaa998c5fe\") " Apr 20 19:15:53.011545 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.011438 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-bundle" (OuterVolumeSpecName: "bundle") pod "480853bc-a688-49b5-9912-6cfaa998c5fe" (UID: "480853bc-a688-49b5-9912-6cfaa998c5fe"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:15:53.012975 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.012949 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480853bc-a688-49b5-9912-6cfaa998c5fe-kube-api-access-mqrcl" (OuterVolumeSpecName: "kube-api-access-mqrcl") pod "480853bc-a688-49b5-9912-6cfaa998c5fe" (UID: "480853bc-a688-49b5-9912-6cfaa998c5fe"). InnerVolumeSpecName "kube-api-access-mqrcl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:15:53.015721 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.015699 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-util" (OuterVolumeSpecName: "util") pod "480853bc-a688-49b5-9912-6cfaa998c5fe" (UID: "480853bc-a688-49b5-9912-6cfaa998c5fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:15:53.111408 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.111376 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqrcl\" (UniqueName: \"kubernetes.io/projected/480853bc-a688-49b5-9912-6cfaa998c5fe-kube-api-access-mqrcl\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:53.111408 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.111407 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:53.111568 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.111417 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/480853bc-a688-49b5-9912-6cfaa998c5fe-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:15:53.788004 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.787969 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" event={"ID":"480853bc-a688-49b5-9912-6cfaa998c5fe","Type":"ContainerDied","Data":"2e8639f402618b90c588ee3bb50d5382646b33d166f2754bc6a8f6900d903d47"} Apr 20 19:15:53.788004 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.788005 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e8639f402618b90c588ee3bb50d5382646b33d166f2754bc6a8f6900d903d47" Apr 20 19:15:53.788234 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:53.787979 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c54qds9" Apr 20 19:15:57.875187 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875148 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5"] Apr 20 19:15:57.875683 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875661 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="util" Apr 20 19:15:57.875683 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875680 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="util" Apr 20 19:15:57.875797 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875696 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="pull" Apr 20 19:15:57.875797 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875704 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="pull" Apr 20 19:15:57.875797 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875720 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="extract" Apr 20 19:15:57.875797 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875729 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="extract" Apr 20 19:15:57.876057 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.875838 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="480853bc-a688-49b5-9912-6cfaa998c5fe" containerName="extract" Apr 20 19:15:57.878620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.878599 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:57.881401 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.881380 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:15:57.882423 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.882403 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:15:57.882517 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.882426 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qnrr\"" Apr 20 19:15:57.892595 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.892572 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5"] Apr 20 19:15:57.942620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.942593 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8chvb\" (UniqueName: \"kubernetes.io/projected/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-kube-api-access-8chvb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: 
\"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:57.942779 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.942644 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:57.942779 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:57.942696 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.043166 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.043126 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.043381 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.043179 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.043381 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.043220 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8chvb\" (UniqueName: \"kubernetes.io/projected/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-kube-api-access-8chvb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.043579 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.043550 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.043654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.043613 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.052152 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.052132 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8chvb\" (UniqueName: \"kubernetes.io/projected/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-kube-api-access-8chvb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.187855 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.187755 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" Apr 20 19:15:58.318196 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:15:58.318169 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a0e345e_ceb4_487b_af95_7a8ad858ecfe.slice/crio-e36c5b8672fe983e0f6b731bfcfc77ff652f271c9c857d4d564570c544a34325 WatchSource:0}: Error finding container e36c5b8672fe983e0f6b731bfcfc77ff652f271c9c857d4d564570c544a34325: Status 404 returned error can't find the container with id e36c5b8672fe983e0f6b731bfcfc77ff652f271c9c857d4d564570c544a34325 Apr 20 19:15:58.325795 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.325772 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5"] Apr 20 19:15:58.805515 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.805473 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerID="e3a9e6bfd2d09f47bf3a623ce82585bd11b95b5fcc8b1484f83b4ac4e8dc3111" exitCode=0 Apr 20 19:15:58.805810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.805537 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" event={"ID":"3a0e345e-ceb4-487b-af95-7a8ad858ecfe","Type":"ContainerDied","Data":"e3a9e6bfd2d09f47bf3a623ce82585bd11b95b5fcc8b1484f83b4ac4e8dc3111"} Apr 20 19:15:58.805810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:58.805562 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" event={"ID":"3a0e345e-ceb4-487b-af95-7a8ad858ecfe","Type":"ContainerStarted","Data":"e36c5b8672fe983e0f6b731bfcfc77ff652f271c9c857d4d564570c544a34325"} Apr 20 19:15:59.810668 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.810572 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerID="fe1cdc62d077e4004daa450e377374dd3d159bb36942f300dbe10010c6eb0c67" exitCode=0 Apr 20 19:15:59.811028 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.810660 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" event={"ID":"3a0e345e-ceb4-487b-af95-7a8ad858ecfe","Type":"ContainerDied","Data":"fe1cdc62d077e4004daa450e377374dd3d159bb36942f300dbe10010c6eb0c67"} Apr 20 19:15:59.904940 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.904908 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt"] Apr 20 19:15:59.908111 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.908088 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:15:59.912761 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.912738 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:15:59.912904 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.912881 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kdxcc\"" Apr 20 19:15:59.913088 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.913074 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:15:59.914423 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.914404 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:15:59.940903 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.940875 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt"] Apr 20 19:15:59.947516 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.947479 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:15:59.956923 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.956896 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed10f0f0-205a-48fe-9a6e-13bd52000a30-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:15:59.957068 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.956934 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnjht\" (UniqueName: \"kubernetes.io/projected/ed10f0f0-205a-48fe-9a6e-13bd52000a30-kube-api-access-pnjht\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:15:59.957068 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:15:59.956964 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed10f0f0-205a-48fe-9a6e-13bd52000a30-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.058036 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.057996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnjht\" (UniqueName: \"kubernetes.io/projected/ed10f0f0-205a-48fe-9a6e-13bd52000a30-kube-api-access-pnjht\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.058205 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.058063 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed10f0f0-205a-48fe-9a6e-13bd52000a30-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.058205 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.058123 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed10f0f0-205a-48fe-9a6e-13bd52000a30-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.060739 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.060682 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ed10f0f0-205a-48fe-9a6e-13bd52000a30-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.060739 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.060711 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ed10f0f0-205a-48fe-9a6e-13bd52000a30-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.090931 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.090899 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnjht\" (UniqueName: \"kubernetes.io/projected/ed10f0f0-205a-48fe-9a6e-13bd52000a30-kube-api-access-pnjht\") pod \"opendatahub-operator-controller-manager-6c77764cd6-lcfgt\" (UID: \"ed10f0f0-205a-48fe-9a6e-13bd52000a30\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.219227 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.219194 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" Apr 20 19:16:00.349253 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.349221 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt"] Apr 20 19:16:00.352929 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:16:00.352887 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded10f0f0_205a_48fe_9a6e_13bd52000a30.slice/crio-5875c85c1a14fcbb1fd801d11e17c4219394f8976b6173a3ce51cbeec384b48e WatchSource:0}: Error finding container 5875c85c1a14fcbb1fd801d11e17c4219394f8976b6173a3ce51cbeec384b48e: Status 404 returned error can't find the container with id 5875c85c1a14fcbb1fd801d11e17c4219394f8976b6173a3ce51cbeec384b48e Apr 20 19:16:00.816496 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.816405 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" event={"ID":"ed10f0f0-205a-48fe-9a6e-13bd52000a30","Type":"ContainerStarted","Data":"5875c85c1a14fcbb1fd801d11e17c4219394f8976b6173a3ce51cbeec384b48e"} Apr 20 19:16:00.818694 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.818659 2583 generic.go:358] "Generic (PLEG): container finished" podID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerID="fcdb171b33824279f3fe3faba2191e99dfa78c4cdcd2e4dce3ecc427187521a6" exitCode=0 Apr 20 19:16:00.818839 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.818726 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" event={"ID":"3a0e345e-ceb4-487b-af95-7a8ad858ecfe","Type":"ContainerDied","Data":"fcdb171b33824279f3fe3faba2191e99dfa78c4cdcd2e4dce3ecc427187521a6"} Apr 20 19:16:00.864759 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.864721 2583 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"] Apr 20 19:16:00.868277 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.868255 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" Apr 20 19:16:00.871427 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.871295 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:16:00.871427 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.871362 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:16:00.871628 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.871611 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:16:00.871949 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.871925 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pjvft\"" Apr 20 19:16:00.881122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.881100 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"] Apr 20 19:16:00.963769 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.963731 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfw4\" (UniqueName: \"kubernetes.io/projected/919702f8-9e00-4cf6-b1cc-f45966b21a1d-kube-api-access-9lfw4\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" Apr 20 19:16:00.963978 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.963811 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/919702f8-9e00-4cf6-b1cc-f45966b21a1d-manager-config\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" Apr 20 19:16:00.963978 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.963862 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/919702f8-9e00-4cf6-b1cc-f45966b21a1d-metrics-cert\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" Apr 20 19:16:00.963978 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:00.963892 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/919702f8-9e00-4cf6-b1cc-f45966b21a1d-cert\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" Apr 20 19:16:01.064606 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.064568 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfw4\" (UniqueName: \"kubernetes.io/projected/919702f8-9e00-4cf6-b1cc-f45966b21a1d-kube-api-access-9lfw4\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" Apr 20 19:16:01.064800 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.064653 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/919702f8-9e00-4cf6-b1cc-f45966b21a1d-manager-config\") pod 
\"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.064800 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.064712 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/919702f8-9e00-4cf6-b1cc-f45966b21a1d-metrics-cert\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.064800 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.064753 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/919702f8-9e00-4cf6-b1cc-f45966b21a1d-cert\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.065657 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.065599 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/919702f8-9e00-4cf6-b1cc-f45966b21a1d-manager-config\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.068081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.068007 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/919702f8-9e00-4cf6-b1cc-f45966b21a1d-metrics-cert\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.068499 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.068476 2583 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/919702f8-9e00-4cf6-b1cc-f45966b21a1d-cert\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.074094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.074064 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfw4\" (UniqueName: \"kubernetes.io/projected/919702f8-9e00-4cf6-b1cc-f45966b21a1d-kube-api-access-9lfw4\") pod \"lws-controller-manager-fcf468c68-lzc9n\" (UID: \"919702f8-9e00-4cf6-b1cc-f45966b21a1d\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.182559 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.182520 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:01.339357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.339326 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"]
Apr 20 19:16:01.342347 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:16:01.342298 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod919702f8_9e00_4cf6_b1cc_f45966b21a1d.slice/crio-d8aa4e88ad00d520617fedd57284addf8a2023b03d4b22b66add2b5063983bc1 WatchSource:0}: Error finding container d8aa4e88ad00d520617fedd57284addf8a2023b03d4b22b66add2b5063983bc1: Status 404 returned error can't find the container with id d8aa4e88ad00d520617fedd57284addf8a2023b03d4b22b66add2b5063983bc1
Apr 20 19:16:01.824473 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:01.824437 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
event={"ID":"919702f8-9e00-4cf6-b1cc-f45966b21a1d","Type":"ContainerStarted","Data":"d8aa4e88ad00d520617fedd57284addf8a2023b03d4b22b66add2b5063983bc1"}
Apr 20 19:16:02.196670 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.196645 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5"
Apr 20 19:16:02.272193 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.272159 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8chvb\" (UniqueName: \"kubernetes.io/projected/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-kube-api-access-8chvb\") pod \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") "
Apr 20 19:16:02.272371 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.272207 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-bundle\") pod \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") "
Apr 20 19:16:02.273731 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.273697 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-bundle" (OuterVolumeSpecName: "bundle") pod "3a0e345e-ceb4-487b-af95-7a8ad858ecfe" (UID: "3a0e345e-ceb4-487b-af95-7a8ad858ecfe"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:16:02.274867 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.274838 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-kube-api-access-8chvb" (OuterVolumeSpecName: "kube-api-access-8chvb") pod "3a0e345e-ceb4-487b-af95-7a8ad858ecfe" (UID: "3a0e345e-ceb4-487b-af95-7a8ad858ecfe").
InnerVolumeSpecName "kube-api-access-8chvb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:16:02.372732 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.372701 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-util\") pod \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\" (UID: \"3a0e345e-ceb4-487b-af95-7a8ad858ecfe\") "
Apr 20 19:16:02.373175 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.373123 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8chvb\" (UniqueName: \"kubernetes.io/projected/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-kube-api-access-8chvb\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:16:02.373175 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.373153 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:16:02.382343 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.380975 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-util" (OuterVolumeSpecName: "util") pod "3a0e345e-ceb4-487b-af95-7a8ad858ecfe" (UID: "3a0e345e-ceb4-487b-af95-7a8ad858ecfe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:16:02.473694 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.473615 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a0e345e-ceb4-487b-af95-7a8ad858ecfe-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:16:02.829811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.829784 2583 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5"
Apr 20 19:16:02.829811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.829790 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9fzvk5" event={"ID":"3a0e345e-ceb4-487b-af95-7a8ad858ecfe","Type":"ContainerDied","Data":"e36c5b8672fe983e0f6b731bfcfc77ff652f271c9c857d4d564570c544a34325"}
Apr 20 19:16:02.830333 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:02.829824 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e36c5b8672fe983e0f6b731bfcfc77ff652f271c9c857d4d564570c544a34325"
Apr 20 19:16:03.835277 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:03.835238 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" event={"ID":"ed10f0f0-205a-48fe-9a6e-13bd52000a30","Type":"ContainerStarted","Data":"2534ab813c6444c218e2a300d42583e3a902b9d17471834a5c8904707b53081c"}
Apr 20 19:16:03.835723 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:03.835462 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt"
Apr 20 19:16:03.836589 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:03.836568 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" event={"ID":"919702f8-9e00-4cf6-b1cc-f45966b21a1d","Type":"ContainerStarted","Data":"dc1fb9cbeb40b18dfcc245b169582addc9d561af43c1a76e5f0c8cd7b05985e7"}
Apr 20 19:16:03.836679 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:03.836646 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:03.862740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:03.862652
2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt" podStartSLOduration=2.312644945 podStartE2EDuration="4.862638633s" podCreationTimestamp="2026-04-20 19:15:59 +0000 UTC" firstStartedPulling="2026-04-20 19:16:00.355800632 +0000 UTC m=+466.588091118" lastFinishedPulling="2026-04-20 19:16:02.905794317 +0000 UTC m=+469.138084806" observedRunningTime="2026-04-20 19:16:03.858692555 +0000 UTC m=+470.090983054" watchObservedRunningTime="2026-04-20 19:16:03.862638633 +0000 UTC m=+470.094929141"
Apr 20 19:16:03.877923 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:03.877873 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n" podStartSLOduration=1.5839115160000001 podStartE2EDuration="3.877857671s" podCreationTimestamp="2026-04-20 19:16:00 +0000 UTC" firstStartedPulling="2026-04-20 19:16:01.344459079 +0000 UTC m=+467.576749567" lastFinishedPulling="2026-04-20 19:16:03.638405233 +0000 UTC m=+469.870695722" observedRunningTime="2026-04-20 19:16:03.87680329 +0000 UTC m=+470.109093797" watchObservedRunningTime="2026-04-20 19:16:03.877857671 +0000 UTC m=+470.110148233"
Apr 20 19:16:14.842646 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:14.842612 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-lcfgt"
Apr 20 19:16:14.843168 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:14.842753 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-lzc9n"
Apr 20 19:16:27.657163 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657124 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"]
Apr 20 19:16:27.657571 ip-10-0-139-126
kubenswrapper[2583]: I0420 19:16:27.657505 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="extract"
Apr 20 19:16:27.657571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657518 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="extract"
Apr 20 19:16:27.657571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657533 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="util"
Apr 20 19:16:27.657571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657538 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="util"
Apr 20 19:16:27.657571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657553 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="pull"
Apr 20 19:16:27.657571 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657559 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="pull"
Apr 20 19:16:27.657758 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.657613 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a0e345e-ceb4-487b-af95-7a8ad858ecfe" containerName="extract"
Apr 20 19:16:27.663249 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.663230 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.665909 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.665886 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:16:27.665909 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.665896 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qnrr\""
Apr 20 19:16:27.666881 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.666865 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:16:27.669544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.669524 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"]
Apr 20 19:16:27.762111 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.762071 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzsq7\" (UniqueName: \"kubernetes.io/projected/68417e5a-932f-4f7a-b96e-4447b4e51606-kube-api-access-gzsq7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.762111 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.762114 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") "
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.762354 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.762142 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.863537 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.863500 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzsq7\" (UniqueName: \"kubernetes.io/projected/68417e5a-932f-4f7a-b96e-4447b4e51606-kube-api-access-gzsq7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.863747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.863549 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.863747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.863715 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") "
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.863896 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.863879 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.864032 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.864015 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.873351 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.873321 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzsq7\" (UniqueName: \"kubernetes.io/projected/68417e5a-932f-4f7a-b96e-4447b4e51606-kube-api-access-gzsq7\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:27.973967 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:27.973888 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:28.102274 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:28.102242 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"]
Apr 20 19:16:28.105301 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:16:28.105275 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68417e5a_932f_4f7a_b96e_4447b4e51606.slice/crio-4cbf3b39fc0c48e6e3f2e9a11352113a5e422265bc776b6315ea5e877ff678b7 WatchSource:0}: Error finding container 4cbf3b39fc0c48e6e3f2e9a11352113a5e422265bc776b6315ea5e877ff678b7: Status 404 returned error can't find the container with id 4cbf3b39fc0c48e6e3f2e9a11352113a5e422265bc776b6315ea5e877ff678b7
Apr 20 19:16:28.925451 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:28.925414 2583 generic.go:358] "Generic (PLEG): container finished" podID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerID="5fd5d72d2e79add9136442ea234a59ae4294215763a67cd72476b11357f38bde" exitCode=0
Apr 20 19:16:28.925866 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:28.925507 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh" event={"ID":"68417e5a-932f-4f7a-b96e-4447b4e51606","Type":"ContainerDied","Data":"5fd5d72d2e79add9136442ea234a59ae4294215763a67cd72476b11357f38bde"}
Apr 20 19:16:28.925866 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:28.925545 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh" event={"ID":"68417e5a-932f-4f7a-b96e-4447b4e51606","Type":"ContainerStarted","Data":"4cbf3b39fc0c48e6e3f2e9a11352113a5e422265bc776b6315ea5e877ff678b7"}
Apr 20 19:16:29.930109 ip-10-0-139-126 kubenswrapper[2583]:
I0420 19:16:29.930024 2583 generic.go:358] "Generic (PLEG): container finished" podID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerID="df8f68e9d08b172fa7063e38b3fe2fc404d0595d200cd65749211986fe7b520b" exitCode=0
Apr 20 19:16:29.930109 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:29.930076 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh" event={"ID":"68417e5a-932f-4f7a-b96e-4447b4e51606","Type":"ContainerDied","Data":"df8f68e9d08b172fa7063e38b3fe2fc404d0595d200cd65749211986fe7b520b"}
Apr 20 19:16:30.935813 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:30.935770 2583 generic.go:358] "Generic (PLEG): container finished" podID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerID="c7d9108aa83b66a5c6d6730062c55da334678585e7acc59fbe7b3f4f0e4c9873" exitCode=0
Apr 20 19:16:30.936179 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:30.935851 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh" event={"ID":"68417e5a-932f-4f7a-b96e-4447b4e51606","Type":"ContainerDied","Data":"c7d9108aa83b66a5c6d6730062c55da334678585e7acc59fbe7b3f4f0e4c9873"}
Apr 20 19:16:32.058331 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.058288 2583 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:32.097904 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.097875 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-bundle\") pod \"68417e5a-932f-4f7a-b96e-4447b4e51606\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") "
Apr 20 19:16:32.098060 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.097942 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-util\") pod \"68417e5a-932f-4f7a-b96e-4447b4e51606\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") "
Apr 20 19:16:32.098060 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.097981 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzsq7\" (UniqueName: \"kubernetes.io/projected/68417e5a-932f-4f7a-b96e-4447b4e51606-kube-api-access-gzsq7\") pod \"68417e5a-932f-4f7a-b96e-4447b4e51606\" (UID: \"68417e5a-932f-4f7a-b96e-4447b4e51606\") "
Apr 20 19:16:32.102324 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.099522 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-bundle" (OuterVolumeSpecName: "bundle") pod "68417e5a-932f-4f7a-b96e-4447b4e51606" (UID: "68417e5a-932f-4f7a-b96e-4447b4e51606"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:16:32.105794 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.105762 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68417e5a-932f-4f7a-b96e-4447b4e51606-kube-api-access-gzsq7" (OuterVolumeSpecName: "kube-api-access-gzsq7") pod "68417e5a-932f-4f7a-b96e-4447b4e51606" (UID: "68417e5a-932f-4f7a-b96e-4447b4e51606"). InnerVolumeSpecName "kube-api-access-gzsq7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:16:32.106224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.106199 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-util" (OuterVolumeSpecName: "util") pod "68417e5a-932f-4f7a-b96e-4447b4e51606" (UID: "68417e5a-932f-4f7a-b96e-4447b4e51606"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:16:32.198869 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.198780 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:16:32.198869 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.198814 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68417e5a-932f-4f7a-b96e-4447b4e51606-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:16:32.198869 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.198824 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzsq7\" (UniqueName: \"kubernetes.io/projected/68417e5a-932f-4f7a-b96e-4447b4e51606-kube-api-access-gzsq7\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:16:32.944512 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.944465 2583 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh" event={"ID":"68417e5a-932f-4f7a-b96e-4447b4e51606","Type":"ContainerDied","Data":"4cbf3b39fc0c48e6e3f2e9a11352113a5e422265bc776b6315ea5e877ff678b7"}
Apr 20 19:16:32.944512 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.944508 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cbf3b39fc0c48e6e3f2e9a11352113a5e422265bc776b6315ea5e877ff678b7"
Apr 20 19:16:32.944512 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:32.944511 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835682fh"
Apr 20 19:16:41.673244 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673211 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"]
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673535 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="pull"
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673547 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="pull"
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673556 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="extract"
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673561 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="extract"
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673568 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing
container" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="util"
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673573 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="util"
Apr 20 19:16:41.673620 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.673616 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="68417e5a-932f-4f7a-b96e-4447b4e51606" containerName="extract"
Apr 20 19:16:41.679206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.679189 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.682515 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.682496 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:16:41.682643 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.682626 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-6qnrr\""
Apr 20 19:16:41.683665 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.683646 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:16:41.701575 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.701551 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"]
Apr 20 19:16:41.778257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.778220 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID:
\"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.778257 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.778254 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.778490 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.778272 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz2qs\" (UniqueName: \"kubernetes.io/projected/b0da5bdd-61df-4f1f-af5c-757e05372336-kube-api-access-fz2qs\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.879329 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.879269 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.879524 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.879350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") "
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.879524 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.879374 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fz2qs\" (UniqueName: \"kubernetes.io/projected/b0da5bdd-61df-4f1f-af5c-757e05372336-kube-api-access-fz2qs\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.879741 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.879718 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.879819 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.879732 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"
Apr 20 19:16:41.913351 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.913299 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz2qs\" (UniqueName: \"kubernetes.io/projected/b0da5bdd-61df-4f1f-af5c-757e05372336-kube-api-access-fz2qs\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") "
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" Apr 20 19:16:41.988381 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:41.988291 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" Apr 20 19:16:42.156202 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:16:42.156164 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0da5bdd_61df_4f1f_af5c_757e05372336.slice/crio-c3534115ef201637b59e7091cea5a3d58f6f9d569e51bde66a81388ac62b4457 WatchSource:0}: Error finding container c3534115ef201637b59e7091cea5a3d58f6f9d569e51bde66a81388ac62b4457: Status 404 returned error can't find the container with id c3534115ef201637b59e7091cea5a3d58f6f9d569e51bde66a81388ac62b4457 Apr 20 19:16:42.156676 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:42.156650 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7"] Apr 20 19:16:42.980122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:42.980088 2583 generic.go:358] "Generic (PLEG): container finished" podID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerID="dad3640c9f88dfb4957ca0991956325f9cdc15908d7d5dd0395c734fac999d84" exitCode=0 Apr 20 19:16:42.980556 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:42.980132 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" event={"ID":"b0da5bdd-61df-4f1f-af5c-757e05372336","Type":"ContainerDied","Data":"dad3640c9f88dfb4957ca0991956325f9cdc15908d7d5dd0395c734fac999d84"} Apr 20 19:16:42.980556 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:42.980158 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" event={"ID":"b0da5bdd-61df-4f1f-af5c-757e05372336","Type":"ContainerStarted","Data":"c3534115ef201637b59e7091cea5a3d58f6f9d569e51bde66a81388ac62b4457"} Apr 20 19:16:43.984875 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:43.984836 2583 generic.go:358] "Generic (PLEG): container finished" podID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerID="36b89de0b40badeca18d3c8113561810484e7b92c3a312ee81a29eab3ed46a61" exitCode=0 Apr 20 19:16:43.985289 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:43.984903 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" event={"ID":"b0da5bdd-61df-4f1f-af5c-757e05372336","Type":"ContainerDied","Data":"36b89de0b40badeca18d3c8113561810484e7b92c3a312ee81a29eab3ed46a61"} Apr 20 19:16:44.990065 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:44.990030 2583 generic.go:358] "Generic (PLEG): container finished" podID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerID="a1bb2ec19e696327fe0eb9b0009531486596f8776118b943b7aa52f48a6e0e05" exitCode=0 Apr 20 19:16:44.990501 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:44.990117 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" event={"ID":"b0da5bdd-61df-4f1f-af5c-757e05372336","Type":"ContainerDied","Data":"a1bb2ec19e696327fe0eb9b0009531486596f8776118b943b7aa52f48a6e0e05"} Apr 20 19:16:46.116437 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.116415 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" Apr 20 19:16:46.215544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.215509 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-bundle\") pod \"b0da5bdd-61df-4f1f-af5c-757e05372336\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " Apr 20 19:16:46.215730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.215560 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz2qs\" (UniqueName: \"kubernetes.io/projected/b0da5bdd-61df-4f1f-af5c-757e05372336-kube-api-access-fz2qs\") pod \"b0da5bdd-61df-4f1f-af5c-757e05372336\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " Apr 20 19:16:46.215730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.215615 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-util\") pod \"b0da5bdd-61df-4f1f-af5c-757e05372336\" (UID: \"b0da5bdd-61df-4f1f-af5c-757e05372336\") " Apr 20 19:16:46.216445 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.216409 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-bundle" (OuterVolumeSpecName: "bundle") pod "b0da5bdd-61df-4f1f-af5c-757e05372336" (UID: "b0da5bdd-61df-4f1f-af5c-757e05372336"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:16:46.217771 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.217749 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0da5bdd-61df-4f1f-af5c-757e05372336-kube-api-access-fz2qs" (OuterVolumeSpecName: "kube-api-access-fz2qs") pod "b0da5bdd-61df-4f1f-af5c-757e05372336" (UID: "b0da5bdd-61df-4f1f-af5c-757e05372336"). InnerVolumeSpecName "kube-api-access-fz2qs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:16:46.220963 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.220925 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-util" (OuterVolumeSpecName: "util") pod "b0da5bdd-61df-4f1f-af5c-757e05372336" (UID: "b0da5bdd-61df-4f1f-af5c-757e05372336"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:16:46.316493 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.316417 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:16:46.316493 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.316457 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0da5bdd-61df-4f1f-af5c-757e05372336-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:16:46.316493 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.316466 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fz2qs\" (UniqueName: \"kubernetes.io/projected/b0da5bdd-61df-4f1f-af5c-757e05372336-kube-api-access-fz2qs\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:16:46.998547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.998507 2583 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" event={"ID":"b0da5bdd-61df-4f1f-af5c-757e05372336","Type":"ContainerDied","Data":"c3534115ef201637b59e7091cea5a3d58f6f9d569e51bde66a81388ac62b4457"} Apr 20 19:16:46.998547 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.998545 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3534115ef201637b59e7091cea5a3d58f6f9d569e51bde66a81388ac62b4457" Apr 20 19:16:46.998759 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:16:46.998581 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c26frm7" Apr 20 19:17:03.270828 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.270736 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv"] Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271097 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="pull" Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271111 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="pull" Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271120 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="extract" Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271126 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="extract" Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271142 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="util" Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271148 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="util" Apr 20 19:17:03.271272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.271194 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0da5bdd-61df-4f1f-af5c-757e05372336" containerName="extract" Apr 20 19:17:03.274184 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.274162 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.277092 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.277068 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-2989n\"" Apr 20 19:17:03.277274 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.277255 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 19:17:03.287375 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.287353 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv"] Apr 20 19:17:03.357684 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357642 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357684 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:17:03.357683 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357937 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357713 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/16786c15-23e8-40e4-aacf-5e901d7ae46e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357937 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357766 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357937 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357807 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mj4v\" (UniqueName: \"kubernetes.io/projected/16786c15-23e8-40e4-aacf-5e901d7ae46e-kube-api-access-4mj4v\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357937 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357829 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357937 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357867 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.357937 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357910 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.358162 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.357942 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 
19:17:03.458683 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.458644 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.458851 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.458693 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.458851 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.458715 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.458942 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.458901 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/16786c15-23e8-40e4-aacf-5e901d7ae46e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459005 ip-10-0-139-126 kubenswrapper[2583]: 
I0420 19:17:03.458986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459059 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459040 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459116 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459062 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mj4v\" (UniqueName: \"kubernetes.io/projected/16786c15-23e8-40e4-aacf-5e901d7ae46e-kube-api-access-4mj4v\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459116 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459103 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459219 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459182 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459275 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459226 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459577 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459551 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459655 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459580 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459655 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459618 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/16786c15-23e8-40e4-aacf-5e901d7ae46e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.459914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.459891 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.461252 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.461216 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.461440 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.461423 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.467940 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.467918 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mj4v\" (UniqueName: 
\"kubernetes.io/projected/16786c15-23e8-40e4-aacf-5e901d7ae46e-kube-api-access-4mj4v\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.468050 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.467993 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16786c15-23e8-40e4-aacf-5e901d7ae46e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8jvsv\" (UID: \"16786c15-23e8-40e4-aacf-5e901d7ae46e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.584021 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.583994 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:03.729042 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:03.729007 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv"] Apr 20 19:17:03.730674 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:03.730645 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16786c15_23e8_40e4_aacf_5e901d7ae46e.slice/crio-67246fd120e44ee260819f5efe7db0187904e8244ad3c0f6147559013ccdc99a WatchSource:0}: Error finding container 67246fd120e44ee260819f5efe7db0187904e8244ad3c0f6147559013ccdc99a: Status 404 returned error can't find the container with id 67246fd120e44ee260819f5efe7db0187904e8244ad3c0f6147559013ccdc99a Apr 20 19:17:04.058150 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:04.058058 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" event={"ID":"16786c15-23e8-40e4-aacf-5e901d7ae46e","Type":"ContainerStarted","Data":"67246fd120e44ee260819f5efe7db0187904e8244ad3c0f6147559013ccdc99a"} Apr 20 19:17:06.058589 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:06.058546 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 19:17:06.058880 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:06.058626 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 19:17:06.058880 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:06.058662 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 19:17:07.069963 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:07.069925 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" event={"ID":"16786c15-23e8-40e4-aacf-5e901d7ae46e","Type":"ContainerStarted","Data":"cfe1cfa22ae045641d9e13977146eb62883a7822a923e4055cabef8bce179da0"} Apr 20 19:17:07.091135 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:07.091091 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" podStartSLOduration=1.765678641 podStartE2EDuration="4.091077789s" podCreationTimestamp="2026-04-20 19:17:03 +0000 UTC" firstStartedPulling="2026-04-20 19:17:03.732879281 +0000 UTC m=+529.965169785" lastFinishedPulling="2026-04-20 19:17:06.058278444 +0000 UTC m=+532.290568933" observedRunningTime="2026-04-20 19:17:07.088790729 +0000 UTC 
m=+533.321081237" watchObservedRunningTime="2026-04-20 19:17:07.091077789 +0000 UTC m=+533.323368298" Apr 20 19:17:07.585036 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:07.585001 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:07.589759 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:07.589733 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:08.074003 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:08.073974 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:08.074901 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:08.074883 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8jvsv" Apr 20 19:17:18.245208 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.245171 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zsqb4"] Apr 20 19:17:18.249344 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.249300 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:18.252248 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.252230 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:17:18.252355 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.252260 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:17:18.253267 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.253251 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-tm79h\"" Apr 20 19:17:18.261544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.261518 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zsqb4"] Apr 20 19:17:18.287586 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.287558 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24sk\" (UniqueName: \"kubernetes.io/projected/0a25f2fe-2ed3-4a45-a8df-7662318ec965-kube-api-access-x24sk\") pod \"kuadrant-operator-catalog-zsqb4\" (UID: \"0a25f2fe-2ed3-4a45-a8df-7662318ec965\") " pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:18.388275 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.388244 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24sk\" (UniqueName: \"kubernetes.io/projected/0a25f2fe-2ed3-4a45-a8df-7662318ec965-kube-api-access-x24sk\") pod \"kuadrant-operator-catalog-zsqb4\" (UID: \"0a25f2fe-2ed3-4a45-a8df-7662318ec965\") " pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:18.397186 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.397154 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24sk\" (UniqueName: 
\"kubernetes.io/projected/0a25f2fe-2ed3-4a45-a8df-7662318ec965-kube-api-access-x24sk\") pod \"kuadrant-operator-catalog-zsqb4\" (UID: \"0a25f2fe-2ed3-4a45-a8df-7662318ec965\") " pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:18.558516 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.558432 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:18.608718 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.608686 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zsqb4"] Apr 20 19:17:18.683222 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.683195 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zsqb4"] Apr 20 19:17:18.685014 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:18.684981 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a25f2fe_2ed3_4a45_a8df_7662318ec965.slice/crio-d78b6940b6c80facd339fd6dd3dac758c67ccb9e05c370a01b627d3f31e32c33 WatchSource:0}: Error finding container d78b6940b6c80facd339fd6dd3dac758c67ccb9e05c370a01b627d3f31e32c33: Status 404 returned error can't find the container with id d78b6940b6c80facd339fd6dd3dac758c67ccb9e05c370a01b627d3f31e32c33 Apr 20 19:17:18.818956 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.818925 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nbm7p"] Apr 20 19:17:18.823432 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.823413 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:18.829520 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.829484 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nbm7p"] Apr 20 19:17:18.893155 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.893122 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzmpl\" (UniqueName: \"kubernetes.io/projected/b74fbe7d-f0b6-49ce-a4e7-32a21289dab5-kube-api-access-xzmpl\") pod \"kuadrant-operator-catalog-nbm7p\" (UID: \"b74fbe7d-f0b6-49ce-a4e7-32a21289dab5\") " pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:18.994005 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:18.993967 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzmpl\" (UniqueName: \"kubernetes.io/projected/b74fbe7d-f0b6-49ce-a4e7-32a21289dab5-kube-api-access-xzmpl\") pod \"kuadrant-operator-catalog-nbm7p\" (UID: \"b74fbe7d-f0b6-49ce-a4e7-32a21289dab5\") " pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:19.002848 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:19.002825 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzmpl\" (UniqueName: \"kubernetes.io/projected/b74fbe7d-f0b6-49ce-a4e7-32a21289dab5-kube-api-access-xzmpl\") pod \"kuadrant-operator-catalog-nbm7p\" (UID: \"b74fbe7d-f0b6-49ce-a4e7-32a21289dab5\") " pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:19.114630 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:19.114545 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" event={"ID":"0a25f2fe-2ed3-4a45-a8df-7662318ec965","Type":"ContainerStarted","Data":"d78b6940b6c80facd339fd6dd3dac758c67ccb9e05c370a01b627d3f31e32c33"} Apr 20 19:17:19.133813 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:17:19.133783 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:19.257994 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:19.257959 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nbm7p"] Apr 20 19:17:19.261185 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:19.261158 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb74fbe7d_f0b6_49ce_a4e7_32a21289dab5.slice/crio-f5bbadfc17d7d231b4b082123742da0ebf5a20fbe7c43937789b2eec5a5c2853 WatchSource:0}: Error finding container f5bbadfc17d7d231b4b082123742da0ebf5a20fbe7c43937789b2eec5a5c2853: Status 404 returned error can't find the container with id f5bbadfc17d7d231b4b082123742da0ebf5a20fbe7c43937789b2eec5a5c2853 Apr 20 19:17:20.120974 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:20.120931 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" event={"ID":"b74fbe7d-f0b6-49ce-a4e7-32a21289dab5","Type":"ContainerStarted","Data":"f5bbadfc17d7d231b4b082123742da0ebf5a20fbe7c43937789b2eec5a5c2853"} Apr 20 19:17:21.125920 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.125883 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" event={"ID":"0a25f2fe-2ed3-4a45-a8df-7662318ec965","Type":"ContainerStarted","Data":"165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d"} Apr 20 19:17:21.125920 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.125904 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" podUID="0a25f2fe-2ed3-4a45-a8df-7662318ec965" containerName="registry-server" containerID="cri-o://165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d" gracePeriod=2 Apr 20 19:17:21.127558 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.127533 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" event={"ID":"b74fbe7d-f0b6-49ce-a4e7-32a21289dab5","Type":"ContainerStarted","Data":"21b8cd26087a381afb8c9985c6b4d0e652d35566cbb8f2879fcee49ff2faa90f"} Apr 20 19:17:21.142497 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.142436 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" podStartSLOduration=0.788001415 podStartE2EDuration="3.142420392s" podCreationTimestamp="2026-04-20 19:17:18 +0000 UTC" firstStartedPulling="2026-04-20 19:17:18.686410583 +0000 UTC m=+544.918701070" lastFinishedPulling="2026-04-20 19:17:21.040829558 +0000 UTC m=+547.273120047" observedRunningTime="2026-04-20 19:17:21.141135005 +0000 UTC m=+547.373425531" watchObservedRunningTime="2026-04-20 19:17:21.142420392 +0000 UTC m=+547.374710902" Apr 20 19:17:21.160019 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.159959 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" podStartSLOduration=1.37827752 podStartE2EDuration="3.159946213s" podCreationTimestamp="2026-04-20 19:17:18 +0000 UTC" firstStartedPulling="2026-04-20 19:17:19.262651799 +0000 UTC m=+545.494942286" lastFinishedPulling="2026-04-20 19:17:21.04432049 +0000 UTC m=+547.276610979" observedRunningTime="2026-04-20 19:17:21.159124396 +0000 UTC m=+547.391414922" watchObservedRunningTime="2026-04-20 19:17:21.159946213 +0000 UTC m=+547.392236721" Apr 20 19:17:21.366481 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.366458 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-zsqb4_0a25f2fe-2ed3-4a45-a8df-7662318ec965/registry-server/0.log" Apr 20 19:17:21.366598 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.366514 2583 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:21.415863 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.415834 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x24sk\" (UniqueName: \"kubernetes.io/projected/0a25f2fe-2ed3-4a45-a8df-7662318ec965-kube-api-access-x24sk\") pod \"0a25f2fe-2ed3-4a45-a8df-7662318ec965\" (UID: \"0a25f2fe-2ed3-4a45-a8df-7662318ec965\") " Apr 20 19:17:21.418049 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.418029 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a25f2fe-2ed3-4a45-a8df-7662318ec965-kube-api-access-x24sk" (OuterVolumeSpecName: "kube-api-access-x24sk") pod "0a25f2fe-2ed3-4a45-a8df-7662318ec965" (UID: "0a25f2fe-2ed3-4a45-a8df-7662318ec965"). InnerVolumeSpecName "kube-api-access-x24sk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:17:21.516736 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:21.516702 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x24sk\" (UniqueName: \"kubernetes.io/projected/0a25f2fe-2ed3-4a45-a8df-7662318ec965-kube-api-access-x24sk\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:17:22.131758 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.131731 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-zsqb4_0a25f2fe-2ed3-4a45-a8df-7662318ec965/registry-server/0.log" Apr 20 19:17:22.132225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.131769 2583 generic.go:358] "Generic (PLEG): container finished" podID="0a25f2fe-2ed3-4a45-a8df-7662318ec965" containerID="165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d" exitCode=2 Apr 20 19:17:22.132225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.131809 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" event={"ID":"0a25f2fe-2ed3-4a45-a8df-7662318ec965","Type":"ContainerDied","Data":"165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d"} Apr 20 19:17:22.132225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.131847 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" event={"ID":"0a25f2fe-2ed3-4a45-a8df-7662318ec965","Type":"ContainerDied","Data":"d78b6940b6c80facd339fd6dd3dac758c67ccb9e05c370a01b627d3f31e32c33"} Apr 20 19:17:22.132225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.131863 2583 scope.go:117] "RemoveContainer" containerID="165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d" Apr 20 19:17:22.132225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.131860 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zsqb4" Apr 20 19:17:22.141057 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.141040 2583 scope.go:117] "RemoveContainer" containerID="165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d" Apr 20 19:17:22.141359 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:17:22.141300 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d\": container with ID starting with 165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d not found: ID does not exist" containerID="165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d" Apr 20 19:17:22.141455 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.141371 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d"} err="failed to get container status \"165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d\": 
rpc error: code = NotFound desc = could not find container \"165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d\": container with ID starting with 165ff52fca54c70879c88c6528879907681c8a4670bd53e14d853456bdecbb8d not found: ID does not exist" Apr 20 19:17:22.154498 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.154474 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zsqb4"] Apr 20 19:17:22.158683 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.158658 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zsqb4"] Apr 20 19:17:22.361357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:22.361300 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a25f2fe-2ed3-4a45-a8df-7662318ec965" path="/var/lib/kubelet/pods/0a25f2fe-2ed3-4a45-a8df-7662318ec965/volumes" Apr 20 19:17:25.440705 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.440669 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fd79b5855-7pg5g"] Apr 20 19:17:25.441118 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.441017 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a25f2fe-2ed3-4a45-a8df-7662318ec965" containerName="registry-server" Apr 20 19:17:25.441118 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.441033 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a25f2fe-2ed3-4a45-a8df-7662318ec965" containerName="registry-server" Apr 20 19:17:25.441118 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.441108 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a25f2fe-2ed3-4a45-a8df-7662318ec965" containerName="registry-server" Apr 20 19:17:25.446129 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.446108 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.458362 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.458335 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fd79b5855-7pg5g"] Apr 20 19:17:25.551539 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551505 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f017cc90-7be5-407c-a49d-dc0386f71a29-console-oauth-config\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.551696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551550 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f017cc90-7be5-407c-a49d-dc0386f71a29-console-serving-cert\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.551696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551594 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-console-config\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.551696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551616 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc76c\" (UniqueName: \"kubernetes.io/projected/f017cc90-7be5-407c-a49d-dc0386f71a29-kube-api-access-jc76c\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " 
pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.551696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551651 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-oauth-serving-cert\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.551696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551696 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-service-ca\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.551874 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.551713 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-trusted-ca-bundle\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.652864 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.652829 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-oauth-serving-cert\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.652864 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.652870 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-service-ca\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.652897 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-trusted-ca-bundle\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.652947 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f017cc90-7be5-407c-a49d-dc0386f71a29-console-oauth-config\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.652998 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f017cc90-7be5-407c-a49d-dc0386f71a29-console-serving-cert\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.653032 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-console-config\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653094 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.653061 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jc76c\" (UniqueName: \"kubernetes.io/projected/f017cc90-7be5-407c-a49d-dc0386f71a29-kube-api-access-jc76c\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653669 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.653642 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-service-ca\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653780 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.653709 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-oauth-serving-cert\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653780 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.653718 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-console-config\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.653864 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.653844 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f017cc90-7be5-407c-a49d-dc0386f71a29-trusted-ca-bundle\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.656159 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.656136 
2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f017cc90-7be5-407c-a49d-dc0386f71a29-console-oauth-config\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.656364 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.656343 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f017cc90-7be5-407c-a49d-dc0386f71a29-console-serving-cert\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.663685 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.663660 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc76c\" (UniqueName: \"kubernetes.io/projected/f017cc90-7be5-407c-a49d-dc0386f71a29-kube-api-access-jc76c\") pod \"console-7fd79b5855-7pg5g\" (UID: \"f017cc90-7be5-407c-a49d-dc0386f71a29\") " pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.755910 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.755805 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fd79b5855-7pg5g" Apr 20 19:17:25.879989 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:25.879869 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fd79b5855-7pg5g"] Apr 20 19:17:25.881935 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:25.881910 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf017cc90_7be5_407c_a49d_dc0386f71a29.slice/crio-e93bcf9327045386204786a15c13bce7baa2543bc2be2e28906322b2013e8f9a WatchSource:0}: Error finding container e93bcf9327045386204786a15c13bce7baa2543bc2be2e28906322b2013e8f9a: Status 404 returned error can't find the container with id e93bcf9327045386204786a15c13bce7baa2543bc2be2e28906322b2013e8f9a Apr 20 19:17:26.152701 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:26.152669 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fd79b5855-7pg5g" event={"ID":"f017cc90-7be5-407c-a49d-dc0386f71a29","Type":"ContainerStarted","Data":"df206a3aac71541a338f2c6289478e72456c990095b63e16b9a5f3c0d8542898"} Apr 20 19:17:26.152701 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:26.152703 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fd79b5855-7pg5g" event={"ID":"f017cc90-7be5-407c-a49d-dc0386f71a29","Type":"ContainerStarted","Data":"e93bcf9327045386204786a15c13bce7baa2543bc2be2e28906322b2013e8f9a"} Apr 20 19:17:26.172439 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:26.172385 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fd79b5855-7pg5g" podStartSLOduration=1.172371064 podStartE2EDuration="1.172371064s" podCreationTimestamp="2026-04-20 19:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:17:26.170074752 +0000 UTC 
m=+552.402365261" watchObservedRunningTime="2026-04-20 19:17:26.172371064 +0000 UTC m=+552.404661572" Apr 20 19:17:29.134176 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:29.134138 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:29.134176 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:29.134185 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:29.156145 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:29.156119 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:29.183687 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:29.183659 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-nbm7p" Apr 20 19:17:30.254747 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.254712 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr"] Apr 20 19:17:30.258404 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.258388 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.260872 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.260850 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ggnhw\"" Apr 20 19:17:30.265544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.265521 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr"] Apr 20 19:17:30.294549 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.294520 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.294743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.294566 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.294743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.294631 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn76k\" (UniqueName: \"kubernetes.io/projected/dc114f12-adc1-421a-96a0-569fc1da86aa-kube-api-access-pn76k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.395966 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.395936 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.396100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.395983 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.396100 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.396042 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pn76k\" (UniqueName: \"kubernetes.io/projected/dc114f12-adc1-421a-96a0-569fc1da86aa-kube-api-access-pn76k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.396348 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.396300 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.396415 
ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.396399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.406633 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.406613 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn76k\" (UniqueName: \"kubernetes.io/projected/dc114f12-adc1-421a-96a0-569fc1da86aa-kube-api-access-pn76k\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.568877 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.568853 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:30.696097 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.696068 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr"] Apr 20 19:17:30.697702 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:30.697674 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc114f12_adc1_421a_96a0_569fc1da86aa.slice/crio-dca25353a1c1498bff436b9ad49bd78b280d1db9f971e63aa5f9888a5bfccdb3 WatchSource:0}: Error finding container dca25353a1c1498bff436b9ad49bd78b280d1db9f971e63aa5f9888a5bfccdb3: Status 404 returned error can't find the container with id dca25353a1c1498bff436b9ad49bd78b280d1db9f971e63aa5f9888a5bfccdb3 Apr 20 19:17:30.856372 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.856259 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7"] Apr 20 19:17:30.859687 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.859666 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:30.867875 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.867854 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7"] Apr 20 19:17:30.899837 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.899815 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrvg7\" (UniqueName: \"kubernetes.io/projected/116a59bd-7951-46e1-9b32-6b87612bb943-kube-api-access-vrvg7\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:30.899973 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.899859 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:30.899973 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:30.899883 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.000931 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.000900 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.001106 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.000980 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrvg7\" (UniqueName: \"kubernetes.io/projected/116a59bd-7951-46e1-9b32-6b87612bb943-kube-api-access-vrvg7\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.001106 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.001018 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.001348 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.001290 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.001348 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.001332 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.009081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.009064 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrvg7\" (UniqueName: \"kubernetes.io/projected/116a59bd-7951-46e1-9b32-6b87612bb943-kube-api-access-vrvg7\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.168833 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.168759 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" Apr 20 19:17:31.170962 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.170929 2583 generic.go:358] "Generic (PLEG): container finished" podID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerID="9d6d57bd0644f6430db57b0836f2ba2258bcd88d7946e39cfcf8eeada975fe5f" exitCode=0 Apr 20 19:17:31.171091 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.171015 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" event={"ID":"dc114f12-adc1-421a-96a0-569fc1da86aa","Type":"ContainerDied","Data":"9d6d57bd0644f6430db57b0836f2ba2258bcd88d7946e39cfcf8eeada975fe5f"} Apr 20 19:17:31.171091 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.171054 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" event={"ID":"dc114f12-adc1-421a-96a0-569fc1da86aa","Type":"ContainerStarted","Data":"dca25353a1c1498bff436b9ad49bd78b280d1db9f971e63aa5f9888a5bfccdb3"} Apr 20 19:17:31.293865 ip-10-0-139-126 
kubenswrapper[2583]: I0420 19:17:31.293840 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7"] Apr 20 19:17:31.296002 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:31.295975 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116a59bd_7951_46e1_9b32_6b87612bb943.slice/crio-b344a42b86e6bd99f857b71523e251dbd069db0c514681c7db825ab8f8a6559e WatchSource:0}: Error finding container b344a42b86e6bd99f857b71523e251dbd069db0c514681c7db825ab8f8a6559e: Status 404 returned error can't find the container with id b344a42b86e6bd99f857b71523e251dbd069db0c514681c7db825ab8f8a6559e Apr 20 19:17:31.466854 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.466818 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz"] Apr 20 19:17:31.470481 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.470466 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.486663 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.486638 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz"] Apr 20 19:17:31.504693 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.504663 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjmt\" (UniqueName: \"kubernetes.io/projected/4502f57e-2135-44c0-ae6b-b58286052cb1-kube-api-access-fjjmt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.504810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.504720 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.504810 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.504783 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.605890 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.605859 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.606055 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.605902 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.606055 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.605965 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjmt\" (UniqueName: \"kubernetes.io/projected/4502f57e-2135-44c0-ae6b-b58286052cb1-kube-api-access-fjjmt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.606246 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.606226 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.606286 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.606240 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.621012 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.615909 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjmt\" (UniqueName: \"kubernetes.io/projected/4502f57e-2135-44c0-ae6b-b58286052cb1-kube-api-access-fjjmt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.779030 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.778945 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:31.941703 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:31.941670 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz"] Apr 20 19:17:31.961551 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:31.961506 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4502f57e_2135_44c0_ae6b_b58286052cb1.slice/crio-d713211cdf1a389d023ae6f3b1632d5d93b2604e82b531701249affc34517be6 WatchSource:0}: Error finding container d713211cdf1a389d023ae6f3b1632d5d93b2604e82b531701249affc34517be6: Status 404 returned error can't find the container with id d713211cdf1a389d023ae6f3b1632d5d93b2604e82b531701249affc34517be6 Apr 20 19:17:32.115132 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.115100 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s"] Apr 20 19:17:32.118576 ip-10-0-139-126 
kubenswrapper[2583]: I0420 19:17:32.118561 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.137368 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.137340 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s"] Apr 20 19:17:32.175272 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.175245 2583 generic.go:358] "Generic (PLEG): container finished" podID="116a59bd-7951-46e1-9b32-6b87612bb943" containerID="389b6f8371a71a12006240a2b32b2a886786f2a1c0830262f5436b6ca00e93e1" exitCode=0 Apr 20 19:17:32.175414 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.175339 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" event={"ID":"116a59bd-7951-46e1-9b32-6b87612bb943","Type":"ContainerDied","Data":"389b6f8371a71a12006240a2b32b2a886786f2a1c0830262f5436b6ca00e93e1"} Apr 20 19:17:32.175414 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.175368 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" event={"ID":"116a59bd-7951-46e1-9b32-6b87612bb943","Type":"ContainerStarted","Data":"b344a42b86e6bd99f857b71523e251dbd069db0c514681c7db825ab8f8a6559e"} Apr 20 19:17:32.176813 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.176794 2583 generic.go:358] "Generic (PLEG): container finished" podID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerID="937157ebeb4b3a7892afc4b4c11a06937871c7b487c1af6127471f3d29d3435e" exitCode=0 Apr 20 19:17:32.176928 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.176860 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" 
event={"ID":"4502f57e-2135-44c0-ae6b-b58286052cb1","Type":"ContainerDied","Data":"937157ebeb4b3a7892afc4b4c11a06937871c7b487c1af6127471f3d29d3435e"} Apr 20 19:17:32.176928 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.176890 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" event={"ID":"4502f57e-2135-44c0-ae6b-b58286052cb1","Type":"ContainerStarted","Data":"d713211cdf1a389d023ae6f3b1632d5d93b2604e82b531701249affc34517be6"} Apr 20 19:17:32.178792 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.178767 2583 generic.go:358] "Generic (PLEG): container finished" podID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerID="caa3bf8bba7b7af52becfbc096553c0955ae63e6d76f2f73e2c7e20501e24e2b" exitCode=0 Apr 20 19:17:32.178881 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.178859 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" event={"ID":"dc114f12-adc1-421a-96a0-569fc1da86aa","Type":"ContainerDied","Data":"caa3bf8bba7b7af52becfbc096553c0955ae63e6d76f2f73e2c7e20501e24e2b"} Apr 20 19:17:32.212394 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.212368 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.212500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.212458 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-bundle\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.212500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.212485 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4wtc\" (UniqueName: \"kubernetes.io/projected/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-kube-api-access-t4wtc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.313357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.313253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.313357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.313337 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.313357 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.313357 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4wtc\" (UniqueName: \"kubernetes.io/projected/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-kube-api-access-t4wtc\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.313883 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.313682 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.313883 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.313757 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.322355 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.322329 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4wtc\" (UniqueName: \"kubernetes.io/projected/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-kube-api-access-t4wtc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.454448 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.454412 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" Apr 20 19:17:32.588741 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:32.588712 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s"] Apr 20 19:17:32.590095 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:32.590064 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1294cc_e8dd_4819_ab88_7bf81cc2695c.slice/crio-60416d374269d97b445b208a7ef08548d8400a9a41330d9ce380c363cf44843e WatchSource:0}: Error finding container 60416d374269d97b445b208a7ef08548d8400a9a41330d9ce380c363cf44843e: Status 404 returned error can't find the container with id 60416d374269d97b445b208a7ef08548d8400a9a41330d9ce380c363cf44843e Apr 20 19:17:33.184840 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.184807 2583 generic.go:358] "Generic (PLEG): container finished" podID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerID="33c80e1eb5e318e9844a6b2ef93b3857868b28712f6de6dcf5ccaebe40ca5a49" exitCode=0 Apr 20 19:17:33.185039 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.184886 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" event={"ID":"4502f57e-2135-44c0-ae6b-b58286052cb1","Type":"ContainerDied","Data":"33c80e1eb5e318e9844a6b2ef93b3857868b28712f6de6dcf5ccaebe40ca5a49"} Apr 20 19:17:33.186943 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.186911 2583 generic.go:358] "Generic (PLEG): container finished" podID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerID="59e10f49b8bb4d96a517c311bbb59b0c3ad82194f9142e434e080d9daedadd9d" exitCode=0 Apr 20 19:17:33.187039 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.186975 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" event={"ID":"dc114f12-adc1-421a-96a0-569fc1da86aa","Type":"ContainerDied","Data":"59e10f49b8bb4d96a517c311bbb59b0c3ad82194f9142e434e080d9daedadd9d"} Apr 20 19:17:33.188251 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.188231 2583 generic.go:358] "Generic (PLEG): container finished" podID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerID="4c03c50127590064934a52810f1f538f2a92a54bc5b21b9df00fbc1cf7066858" exitCode=0 Apr 20 19:17:33.188378 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.188322 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" event={"ID":"6d1294cc-e8dd-4819-ab88-7bf81cc2695c","Type":"ContainerDied","Data":"4c03c50127590064934a52810f1f538f2a92a54bc5b21b9df00fbc1cf7066858"} Apr 20 19:17:33.188378 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.188347 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" event={"ID":"6d1294cc-e8dd-4819-ab88-7bf81cc2695c","Type":"ContainerStarted","Data":"60416d374269d97b445b208a7ef08548d8400a9a41330d9ce380c363cf44843e"} Apr 20 19:17:33.190225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.190157 2583 generic.go:358] "Generic (PLEG): container finished" podID="116a59bd-7951-46e1-9b32-6b87612bb943" containerID="7a5a00cc77b52e6d218f640d4a18c8b99f8d0d8ed432debe1319f1e49c626b44" exitCode=0 Apr 20 19:17:33.190225 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:33.190193 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" event={"ID":"116a59bd-7951-46e1-9b32-6b87612bb943","Type":"ContainerDied","Data":"7a5a00cc77b52e6d218f640d4a18c8b99f8d0d8ed432debe1319f1e49c626b44"} Apr 20 19:17:34.202546 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.202513 2583 
generic.go:358] "Generic (PLEG): container finished" podID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerID="4ea37d0661021be27fd7fbb9ec303a39b578c7aaa30349f665499fbf26c16bdd" exitCode=0 Apr 20 19:17:34.202966 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.202593 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" event={"ID":"6d1294cc-e8dd-4819-ab88-7bf81cc2695c","Type":"ContainerDied","Data":"4ea37d0661021be27fd7fbb9ec303a39b578c7aaa30349f665499fbf26c16bdd"} Apr 20 19:17:34.204711 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.204690 2583 generic.go:358] "Generic (PLEG): container finished" podID="116a59bd-7951-46e1-9b32-6b87612bb943" containerID="77bfb07d45b72c18812ed8ccb1fc2b765c1487a4677f933d56735a1743b916bb" exitCode=0 Apr 20 19:17:34.204786 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.204750 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" event={"ID":"116a59bd-7951-46e1-9b32-6b87612bb943","Type":"ContainerDied","Data":"77bfb07d45b72c18812ed8ccb1fc2b765c1487a4677f933d56735a1743b916bb"} Apr 20 19:17:34.210387 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.210272 2583 generic.go:358] "Generic (PLEG): container finished" podID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerID="2142078d1b3fa6646f081cde0b11b3240a4fbb71725ac7e22c39b0fd204f0836" exitCode=0 Apr 20 19:17:34.210387 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.210301 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" event={"ID":"4502f57e-2135-44c0-ae6b-b58286052cb1","Type":"ContainerDied","Data":"2142078d1b3fa6646f081cde0b11b3240a4fbb71725ac7e22c39b0fd204f0836"} Apr 20 19:17:34.413822 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.413800 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:34.535081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.534995 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-bundle\") pod \"dc114f12-adc1-421a-96a0-569fc1da86aa\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " Apr 20 19:17:34.535227 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.535088 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn76k\" (UniqueName: \"kubernetes.io/projected/dc114f12-adc1-421a-96a0-569fc1da86aa-kube-api-access-pn76k\") pod \"dc114f12-adc1-421a-96a0-569fc1da86aa\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " Apr 20 19:17:34.535227 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.535133 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-util\") pod \"dc114f12-adc1-421a-96a0-569fc1da86aa\" (UID: \"dc114f12-adc1-421a-96a0-569fc1da86aa\") " Apr 20 19:17:34.535640 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.535615 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-bundle" (OuterVolumeSpecName: "bundle") pod "dc114f12-adc1-421a-96a0-569fc1da86aa" (UID: "dc114f12-adc1-421a-96a0-569fc1da86aa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:17:34.537291 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.537265 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc114f12-adc1-421a-96a0-569fc1da86aa-kube-api-access-pn76k" (OuterVolumeSpecName: "kube-api-access-pn76k") pod "dc114f12-adc1-421a-96a0-569fc1da86aa" (UID: "dc114f12-adc1-421a-96a0-569fc1da86aa"). InnerVolumeSpecName "kube-api-access-pn76k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:17:34.540474 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.540451 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-util" (OuterVolumeSpecName: "util") pod "dc114f12-adc1-421a-96a0-569fc1da86aa" (UID: "dc114f12-adc1-421a-96a0-569fc1da86aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:17:34.636278 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.636247 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pn76k\" (UniqueName: \"kubernetes.io/projected/dc114f12-adc1-421a-96a0-569fc1da86aa-kube-api-access-pn76k\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:17:34.636278 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.636273 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:17:34.636278 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:34.636286 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc114f12-adc1-421a-96a0-569fc1da86aa-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:17:35.215746 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.215715 2583 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" Apr 20 19:17:35.216188 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.215712 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr" event={"ID":"dc114f12-adc1-421a-96a0-569fc1da86aa","Type":"ContainerDied","Data":"dca25353a1c1498bff436b9ad49bd78b280d1db9f971e63aa5f9888a5bfccdb3"} Apr 20 19:17:35.216188 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.215794 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dca25353a1c1498bff436b9ad49bd78b280d1db9f971e63aa5f9888a5bfccdb3" Apr 20 19:17:35.217914 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.217884 2583 generic.go:358] "Generic (PLEG): container finished" podID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerID="f9352152fb91de81c38dce924096d8fa30c9a321cfcf216fdb82dd915c0df2af" exitCode=0 Apr 20 19:17:35.218029 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.217966 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" event={"ID":"6d1294cc-e8dd-4819-ab88-7bf81cc2695c","Type":"ContainerDied","Data":"f9352152fb91de81c38dce924096d8fa30c9a321cfcf216fdb82dd915c0df2af"} Apr 20 19:17:35.351945 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.351922 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" Apr 20 19:17:35.369008 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.368987 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7"
Apr 20 19:17:35.442237 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442210 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-util\") pod \"4502f57e-2135-44c0-ae6b-b58286052cb1\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") "
Apr 20 19:17:35.442419 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442268 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-bundle\") pod \"116a59bd-7951-46e1-9b32-6b87612bb943\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") "
Apr 20 19:17:35.442419 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442294 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrvg7\" (UniqueName: \"kubernetes.io/projected/116a59bd-7951-46e1-9b32-6b87612bb943-kube-api-access-vrvg7\") pod \"116a59bd-7951-46e1-9b32-6b87612bb943\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") "
Apr 20 19:17:35.442419 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442361 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-bundle\") pod \"4502f57e-2135-44c0-ae6b-b58286052cb1\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") "
Apr 20 19:17:35.442419 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442390 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-util\") pod \"116a59bd-7951-46e1-9b32-6b87612bb943\" (UID: \"116a59bd-7951-46e1-9b32-6b87612bb943\") "
Apr 20 19:17:35.442419 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442415 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjmt\" (UniqueName: \"kubernetes.io/projected/4502f57e-2135-44c0-ae6b-b58286052cb1-kube-api-access-fjjmt\") pod \"4502f57e-2135-44c0-ae6b-b58286052cb1\" (UID: \"4502f57e-2135-44c0-ae6b-b58286052cb1\") "
Apr 20 19:17:35.442960 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442912 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-bundle" (OuterVolumeSpecName: "bundle") pod "116a59bd-7951-46e1-9b32-6b87612bb943" (UID: "116a59bd-7951-46e1-9b32-6b87612bb943"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:17:35.442960 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.442926 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-bundle" (OuterVolumeSpecName: "bundle") pod "4502f57e-2135-44c0-ae6b-b58286052cb1" (UID: "4502f57e-2135-44c0-ae6b-b58286052cb1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:17:35.444947 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.444919 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4502f57e-2135-44c0-ae6b-b58286052cb1-kube-api-access-fjjmt" (OuterVolumeSpecName: "kube-api-access-fjjmt") pod "4502f57e-2135-44c0-ae6b-b58286052cb1" (UID: "4502f57e-2135-44c0-ae6b-b58286052cb1"). InnerVolumeSpecName "kube-api-access-fjjmt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:17:35.445271 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.445245 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116a59bd-7951-46e1-9b32-6b87612bb943-kube-api-access-vrvg7" (OuterVolumeSpecName: "kube-api-access-vrvg7") pod "116a59bd-7951-46e1-9b32-6b87612bb943" (UID: "116a59bd-7951-46e1-9b32-6b87612bb943"). InnerVolumeSpecName "kube-api-access-vrvg7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:17:35.448423 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.448402 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-util" (OuterVolumeSpecName: "util") pod "4502f57e-2135-44c0-ae6b-b58286052cb1" (UID: "4502f57e-2135-44c0-ae6b-b58286052cb1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:17:35.452400 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.452371 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-util" (OuterVolumeSpecName: "util") pod "116a59bd-7951-46e1-9b32-6b87612bb943" (UID: "116a59bd-7951-46e1-9b32-6b87612bb943"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:17:35.543270 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.543188 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:35.543270 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.543219 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrvg7\" (UniqueName: \"kubernetes.io/projected/116a59bd-7951-46e1-9b32-6b87612bb943-kube-api-access-vrvg7\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:35.543270 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.543233 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:35.543270 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.543245 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/116a59bd-7951-46e1-9b32-6b87612bb943-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:35.543270 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.543266 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjjmt\" (UniqueName: \"kubernetes.io/projected/4502f57e-2135-44c0-ae6b-b58286052cb1-kube-api-access-fjjmt\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:35.543542 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.543279 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4502f57e-2135-44c0-ae6b-b58286052cb1-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:35.756674 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.756635 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fd79b5855-7pg5g"
Apr 20 19:17:35.756674 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.756677 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7fd79b5855-7pg5g"
Apr 20 19:17:35.761941 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:35.761918 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fd79b5855-7pg5g"
Apr 20 19:17:36.223555 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.223527 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7"
Apr 20 19:17:36.223968 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.223518 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7" event={"ID":"116a59bd-7951-46e1-9b32-6b87612bb943","Type":"ContainerDied","Data":"b344a42b86e6bd99f857b71523e251dbd069db0c514681c7db825ab8f8a6559e"}
Apr 20 19:17:36.223968 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.223651 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b344a42b86e6bd99f857b71523e251dbd069db0c514681c7db825ab8f8a6559e"
Apr 20 19:17:36.225156 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.225138 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz"
Apr 20 19:17:36.225275 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.225160 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz" event={"ID":"4502f57e-2135-44c0-ae6b-b58286052cb1","Type":"ContainerDied","Data":"d713211cdf1a389d023ae6f3b1632d5d93b2604e82b531701249affc34517be6"}
Apr 20 19:17:36.225275 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.225187 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d713211cdf1a389d023ae6f3b1632d5d93b2604e82b531701249affc34517be6"
Apr 20 19:17:36.230381 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.230361 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fd79b5855-7pg5g"
Apr 20 19:17:36.283183 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.283023 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d7f8b87c9-9vhcs"]
Apr 20 19:17:36.371021 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.368838 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s"
Apr 20 19:17:36.453843 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.453813 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-util\") pod \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") "
Apr 20 19:17:36.454010 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.453919 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4wtc\" (UniqueName: \"kubernetes.io/projected/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-kube-api-access-t4wtc\") pod \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") "
Apr 20 19:17:36.454010 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.453954 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-bundle\") pod \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\" (UID: \"6d1294cc-e8dd-4819-ab88-7bf81cc2695c\") "
Apr 20 19:17:36.454542 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.454502 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-bundle" (OuterVolumeSpecName: "bundle") pod "6d1294cc-e8dd-4819-ab88-7bf81cc2695c" (UID: "6d1294cc-e8dd-4819-ab88-7bf81cc2695c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:17:36.456164 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.456140 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-kube-api-access-t4wtc" (OuterVolumeSpecName: "kube-api-access-t4wtc") pod "6d1294cc-e8dd-4819-ab88-7bf81cc2695c" (UID: "6d1294cc-e8dd-4819-ab88-7bf81cc2695c"). InnerVolumeSpecName "kube-api-access-t4wtc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:17:36.459440 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.459418 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-util" (OuterVolumeSpecName: "util") pod "6d1294cc-e8dd-4819-ab88-7bf81cc2695c" (UID: "6d1294cc-e8dd-4819-ab88-7bf81cc2695c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:17:36.555188 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.555101 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4wtc\" (UniqueName: \"kubernetes.io/projected/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-kube-api-access-t4wtc\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:36.555188 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.555133 2583 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:36.555188 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:36.555146 2583 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d1294cc-e8dd-4819-ab88-7bf81cc2695c-util\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:17:37.231081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:37.231055 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s"
Apr 20 19:17:37.231081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:37.231065 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s" event={"ID":"6d1294cc-e8dd-4819-ab88-7bf81cc2695c","Type":"ContainerDied","Data":"60416d374269d97b445b208a7ef08548d8400a9a41330d9ce380c363cf44843e"}
Apr 20 19:17:37.231505 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:37.231094 2583 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60416d374269d97b445b208a7ef08548d8400a9a41330d9ce380c363cf44843e"
Apr 20 19:17:41.095091 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095053 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"]
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095430 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="util"
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095443 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="util"
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095454 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="util"
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095460 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="util"
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095466 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="pull"
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095471 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="pull"
Apr 20 19:17:41.095476 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095477 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="pull"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095482 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="pull"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095488 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095493 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095501 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="util"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095506 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="util"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095511 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095516 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095523 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="pull"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095528 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="pull"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095539 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="pull"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095546 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="pull"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095554 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095559 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095566 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="util"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095571 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="util"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095576 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095581 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095650 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="116a59bd-7951-46e1-9b32-6b87612bb943" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095663 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="4502f57e-2135-44c0-ae6b-b58286052cb1" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095672 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d1294cc-e8dd-4819-ab88-7bf81cc2695c" containerName="extract"
Apr 20 19:17:41.095702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.095682 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc114f12-adc1-421a-96a0-569fc1da86aa" containerName="extract"
Apr 20 19:17:41.105462 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.105441 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:41.109517 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.109500 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 20 19:17:41.109583 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.109500 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-d7dm4\""
Apr 20 19:17:41.116049 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.116025 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"]
Apr 20 19:17:41.196661 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.196627 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cl46\" (UniqueName: \"kubernetes.io/projected/ce6fd9d3-3d80-479c-8d81-378eb8d656e9-kube-api-access-8cl46\") pod \"dns-operator-controller-manager-648d5c98bc-9kfbb\" (UID: \"ce6fd9d3-3d80-479c-8d81-378eb8d656e9\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:41.297060 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.297021 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cl46\" (UniqueName: \"kubernetes.io/projected/ce6fd9d3-3d80-479c-8d81-378eb8d656e9-kube-api-access-8cl46\") pod \"dns-operator-controller-manager-648d5c98bc-9kfbb\" (UID: \"ce6fd9d3-3d80-479c-8d81-378eb8d656e9\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:41.306684 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.306664 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cl46\" (UniqueName: \"kubernetes.io/projected/ce6fd9d3-3d80-479c-8d81-378eb8d656e9-kube-api-access-8cl46\") pod \"dns-operator-controller-manager-648d5c98bc-9kfbb\" (UID: \"ce6fd9d3-3d80-479c-8d81-378eb8d656e9\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:41.416026 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.415951 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:41.563710 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:41.563682 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"]
Apr 20 19:17:41.564757 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:41.564733 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce6fd9d3_3d80_479c_8d81_378eb8d656e9.slice/crio-0fe3dc09e53326d6a0a41583f5012eb6e14409eea32b5090e9b9cf9f170929fa WatchSource:0}: Error finding container 0fe3dc09e53326d6a0a41583f5012eb6e14409eea32b5090e9b9cf9f170929fa: Status 404 returned error can't find the container with id 0fe3dc09e53326d6a0a41583f5012eb6e14409eea32b5090e9b9cf9f170929fa
Apr 20 19:17:42.252744 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:42.252712 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb" event={"ID":"ce6fd9d3-3d80-479c-8d81-378eb8d656e9","Type":"ContainerStarted","Data":"0fe3dc09e53326d6a0a41583f5012eb6e14409eea32b5090e9b9cf9f170929fa"}
Apr 20 19:17:44.262625 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.262586 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb" event={"ID":"ce6fd9d3-3d80-479c-8d81-378eb8d656e9","Type":"ContainerStarted","Data":"b4d0c96568725b4e7cb9aec574ba0007019a57d9c5e6cf30c65d5d109123043a"}
Apr 20 19:17:44.263073 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.262694 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:44.280241 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.280191 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb" podStartSLOduration=1.095824278 podStartE2EDuration="3.280178196s" podCreationTimestamp="2026-04-20 19:17:41 +0000 UTC" firstStartedPulling="2026-04-20 19:17:41.566758401 +0000 UTC m=+567.799048887" lastFinishedPulling="2026-04-20 19:17:43.751112316 +0000 UTC m=+569.983402805" observedRunningTime="2026-04-20 19:17:44.278729903 +0000 UTC m=+570.511020411" watchObservedRunningTime="2026-04-20 19:17:44.280178196 +0000 UTC m=+570.512468704"
Apr 20 19:17:44.702589 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.702553 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"]
Apr 20 19:17:44.705967 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.705951 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:44.709180 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.709158 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-k988g\""
Apr 20 19:17:44.715563 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.715541 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"]
Apr 20 19:17:44.827073 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.827040 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9t9b\" (UniqueName: \"kubernetes.io/projected/835210c1-7c43-47c8-b39a-79af6fd26d30-kube-api-access-g9t9b\") pod \"limitador-operator-controller-manager-85c4996f8c-64v2k\" (UID: \"835210c1-7c43-47c8-b39a-79af6fd26d30\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:44.928078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.928046 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9t9b\" (UniqueName: \"kubernetes.io/projected/835210c1-7c43-47c8-b39a-79af6fd26d30-kube-api-access-g9t9b\") pod \"limitador-operator-controller-manager-85c4996f8c-64v2k\" (UID: \"835210c1-7c43-47c8-b39a-79af6fd26d30\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:44.944024 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:44.943990 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9t9b\" (UniqueName: \"kubernetes.io/projected/835210c1-7c43-47c8-b39a-79af6fd26d30-kube-api-access-g9t9b\") pod \"limitador-operator-controller-manager-85c4996f8c-64v2k\" (UID: \"835210c1-7c43-47c8-b39a-79af6fd26d30\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:45.016761 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:45.016687 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:45.148924 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:45.148895 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"]
Apr 20 19:17:45.150830 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:45.150798 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod835210c1_7c43_47c8_b39a_79af6fd26d30.slice/crio-c7d073e19cfe932b85778aded47dbc19b71eefc2beca9569fa768bf929ffabba WatchSource:0}: Error finding container c7d073e19cfe932b85778aded47dbc19b71eefc2beca9569fa768bf929ffabba: Status 404 returned error can't find the container with id c7d073e19cfe932b85778aded47dbc19b71eefc2beca9569fa768bf929ffabba
Apr 20 19:17:45.267178 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:45.267074 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" event={"ID":"835210c1-7c43-47c8-b39a-79af6fd26d30","Type":"ContainerStarted","Data":"c7d073e19cfe932b85778aded47dbc19b71eefc2beca9569fa768bf929ffabba"}
Apr 20 19:17:47.277376 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:47.277340 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" event={"ID":"835210c1-7c43-47c8-b39a-79af6fd26d30","Type":"ContainerStarted","Data":"0e05f2bdeb282eb157488314a4835380b94789db8ce9bcd21823509c3890980d"}
Apr 20 19:17:47.277719 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:47.277427 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:47.296239 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:47.296187 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" podStartSLOduration=1.287864138 podStartE2EDuration="3.296174779s" podCreationTimestamp="2026-04-20 19:17:44 +0000 UTC" firstStartedPulling="2026-04-20 19:17:45.152915145 +0000 UTC m=+571.385205649" lastFinishedPulling="2026-04-20 19:17:47.161225799 +0000 UTC m=+573.393516290" observedRunningTime="2026-04-20 19:17:47.295922488 +0000 UTC m=+573.528212996" watchObservedRunningTime="2026-04-20 19:17:47.296174779 +0000 UTC m=+573.528465288"
Apr 20 19:17:53.575359 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.575327 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rtcdw"]
Apr 20 19:17:53.579458 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.579435 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw"
Apr 20 19:17:53.582339 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.582300 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-4wzlb\""
Apr 20 19:17:53.590978 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.590955 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rtcdw"]
Apr 20 19:17:53.709656 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.709619 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzjp\" (UniqueName: \"kubernetes.io/projected/9c2ffd5e-2a96-4e32-b4df-6b105a8a5339-kube-api-access-lwzjp\") pod \"authorino-operator-657f44b778-rtcdw\" (UID: \"9c2ffd5e-2a96-4e32-b4df-6b105a8a5339\") " pod="kuadrant-system/authorino-operator-657f44b778-rtcdw"
Apr 20 19:17:53.810399 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.810338 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzjp\" (UniqueName: \"kubernetes.io/projected/9c2ffd5e-2a96-4e32-b4df-6b105a8a5339-kube-api-access-lwzjp\") pod \"authorino-operator-657f44b778-rtcdw\" (UID: \"9c2ffd5e-2a96-4e32-b4df-6b105a8a5339\") " pod="kuadrant-system/authorino-operator-657f44b778-rtcdw"
Apr 20 19:17:53.819463 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.819439 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzjp\" (UniqueName: \"kubernetes.io/projected/9c2ffd5e-2a96-4e32-b4df-6b105a8a5339-kube-api-access-lwzjp\") pod \"authorino-operator-657f44b778-rtcdw\" (UID: \"9c2ffd5e-2a96-4e32-b4df-6b105a8a5339\") " pod="kuadrant-system/authorino-operator-657f44b778-rtcdw"
Apr 20 19:17:53.891129 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:53.891047 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw"
Apr 20 19:17:54.024522 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:54.024494 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-rtcdw"]
Apr 20 19:17:54.025964 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:17:54.025932 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c2ffd5e_2a96_4e32_b4df_6b105a8a5339.slice/crio-71b46d5b0b3a00dc6a070a8390fe2bcb015dac1c2bae8f663c6070e9f105ca33 WatchSource:0}: Error finding container 71b46d5b0b3a00dc6a070a8390fe2bcb015dac1c2bae8f663c6070e9f105ca33: Status 404 returned error can't find the container with id 71b46d5b0b3a00dc6a070a8390fe2bcb015dac1c2bae8f663c6070e9f105ca33
Apr 20 19:17:54.303276 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:54.303190 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw" event={"ID":"9c2ffd5e-2a96-4e32-b4df-6b105a8a5339","Type":"ContainerStarted","Data":"71b46d5b0b3a00dc6a070a8390fe2bcb015dac1c2bae8f663c6070e9f105ca33"}
Apr 20 19:17:55.269863 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:55.269834 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-9kfbb"
Apr 20 19:17:56.312532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:56.312497 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw" event={"ID":"9c2ffd5e-2a96-4e32-b4df-6b105a8a5339","Type":"ContainerStarted","Data":"cbb5caee54231a17a2d99a3d1a8643f586560f67a80c6628c25e5b79a75cb29c"}
Apr 20 19:17:56.312880 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:56.312555 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw"
Apr 20 19:17:58.283433 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:58.283402 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"
Apr 20 19:17:58.302014 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:17:58.301968 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw" podStartSLOduration=3.126439832 podStartE2EDuration="5.301954712s" podCreationTimestamp="2026-04-20 19:17:53 +0000 UTC" firstStartedPulling="2026-04-20 19:17:54.02792566 +0000 UTC m=+580.260216147" lastFinishedPulling="2026-04-20 19:17:56.203440539 +0000 UTC m=+582.435731027" observedRunningTime="2026-04-20 19:17:56.336139706 +0000 UTC m=+582.568430213" watchObservedRunningTime="2026-04-20 19:17:58.301954712 +0000 UTC m=+584.534245220"
Apr 20 19:18:01.308817 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.308777 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d7f8b87c9-9vhcs" podUID="d17a345d-1269-44bf-93d8-04f0d3d0ded5" containerName="console" containerID="cri-o://e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a" gracePeriod=15
Apr 20 19:18:01.555408 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.555385 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d7f8b87c9-9vhcs_d17a345d-1269-44bf-93d8-04f0d3d0ded5/console/0.log"
Apr 20 19:18:01.555530 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.555446 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d7f8b87c9-9vhcs"
Apr 20 19:18:01.676555 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676519 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-config\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.676740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676565 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-trusted-ca-bundle\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.676740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676585 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-oauth-config\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.676740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676630 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-serving-cert\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.676740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676654 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-oauth-serving-cert\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.676740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676680 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzkrm\" (UniqueName: \"kubernetes.io/projected/d17a345d-1269-44bf-93d8-04f0d3d0ded5-kube-api-access-nzkrm\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.676740 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676718 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-service-ca\") pod \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\" (UID: \"d17a345d-1269-44bf-93d8-04f0d3d0ded5\") "
Apr 20 19:18:01.677049 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.676940 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-config" (OuterVolumeSpecName: "console-config") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:18:01.677113 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.677056 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:18:01.677194 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.677165 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-service-ca" (OuterVolumeSpecName: "service-ca") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:18:01.677269 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.677247 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:18:01.679140 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.679108 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17a345d-1269-44bf-93d8-04f0d3d0ded5-kube-api-access-nzkrm" (OuterVolumeSpecName: "kube-api-access-nzkrm") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "kube-api-access-nzkrm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:18:01.679140 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.679120 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:18:01.679292 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.679173 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d17a345d-1269-44bf-93d8-04f0d3d0ded5" (UID: "d17a345d-1269-44bf-93d8-04f0d3d0ded5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:18:01.777715 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777677 2583 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:01.777715 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777709 2583 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-oauth-serving-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:01.777929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777724 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nzkrm\" (UniqueName: \"kubernetes.io/projected/d17a345d-1269-44bf-93d8-04f0d3d0ded5-kube-api-access-nzkrm\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:01.777929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777738 2583 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-service-ca\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:01.777929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777750 2583 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:01.777929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777761 2583 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d17a345d-1269-44bf-93d8-04f0d3d0ded5-trusted-ca-bundle\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:01.777929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:01.777772 2583 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d17a345d-1269-44bf-93d8-04f0d3d0ded5-console-oauth-config\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:02.335453 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.335424 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d7f8b87c9-9vhcs_d17a345d-1269-44bf-93d8-04f0d3d0ded5/console/0.log" Apr 20 19:18:02.335842 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.335467 2583 generic.go:358] "Generic (PLEG): container finished" podID="d17a345d-1269-44bf-93d8-04f0d3d0ded5" containerID="e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a" exitCode=2 Apr 20 19:18:02.335842 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.335543 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d7f8b87c9-9vhcs" Apr 20 19:18:02.335842 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.335560 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d7f8b87c9-9vhcs" event={"ID":"d17a345d-1269-44bf-93d8-04f0d3d0ded5","Type":"ContainerDied","Data":"e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a"} Apr 20 19:18:02.335842 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.335601 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d7f8b87c9-9vhcs" event={"ID":"d17a345d-1269-44bf-93d8-04f0d3d0ded5","Type":"ContainerDied","Data":"885ee1e151a06ee6fd67db9c104a060aca3e174811d7b73ed8cb195ac6fd894a"} Apr 20 19:18:02.335842 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.335618 2583 scope.go:117] "RemoveContainer" containerID="e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a" Apr 20 19:18:02.344833 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.344814 2583 scope.go:117] "RemoveContainer" containerID="e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a" Apr 20 19:18:02.345075 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:18:02.345058 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a\": container with ID starting with e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a not found: ID does not exist" containerID="e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a" Apr 20 19:18:02.345125 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.345083 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a"} err="failed to get container status \"e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a\": rpc error: code = 
NotFound desc = could not find container \"e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a\": container with ID starting with e2a642ce513ba66e8029f28f9ffaeb8e42e9f07804bec142c71d98472b04181a not found: ID does not exist" Apr 20 19:18:02.361979 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.361945 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d7f8b87c9-9vhcs"] Apr 20 19:18:02.361979 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:02.361979 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d7f8b87c9-9vhcs"] Apr 20 19:18:04.360802 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:04.360771 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17a345d-1269-44bf-93d8-04f0d3d0ded5" path="/var/lib/kubelet/pods/d17a345d-1269-44bf-93d8-04f0d3d0ded5/volumes" Apr 20 19:18:07.318062 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:07.318029 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-rtcdw" Apr 20 19:18:09.226594 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.226554 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"] Apr 20 19:18:09.227042 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.226851 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" podUID="835210c1-7c43-47c8-b39a-79af6fd26d30" containerName="manager" containerID="cri-o://0e05f2bdeb282eb157488314a4835380b94789db8ce9bcd21823509c3890980d" gracePeriod=2 Apr 20 19:18:09.251288 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.251260 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k"] Apr 20 19:18:09.268578 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:18:09.268549 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw"] Apr 20 19:18:09.269028 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.269001 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="835210c1-7c43-47c8-b39a-79af6fd26d30" containerName="manager" Apr 20 19:18:09.269028 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.269022 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="835210c1-7c43-47c8-b39a-79af6fd26d30" containerName="manager" Apr 20 19:18:09.269173 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.269033 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d17a345d-1269-44bf-93d8-04f0d3d0ded5" containerName="console" Apr 20 19:18:09.269173 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.269044 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17a345d-1269-44bf-93d8-04f0d3d0ded5" containerName="console" Apr 20 19:18:09.269173 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.269096 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="835210c1-7c43-47c8-b39a-79af6fd26d30" containerName="manager" Apr 20 19:18:09.269173 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.269105 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="d17a345d-1269-44bf-93d8-04f0d3d0ded5" containerName="console" Apr 20 19:18:09.271717 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.271696 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:09.299897 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.299829 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw"] Apr 20 19:18:09.365743 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.365711 2583 generic.go:358] "Generic (PLEG): container finished" podID="835210c1-7c43-47c8-b39a-79af6fd26d30" containerID="0e05f2bdeb282eb157488314a4835380b94789db8ce9bcd21823509c3890980d" exitCode=0 Apr 20 19:18:09.442145 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.442114 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vcn\" (UniqueName: \"kubernetes.io/projected/2e8d84a0-947d-4f9c-82a5-340fe34abf01-kube-api-access-v4vcn\") pod \"limitador-operator-controller-manager-85c4996f8c-gbrtw\" (UID: \"2e8d84a0-947d-4f9c-82a5-340fe34abf01\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:09.461597 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.461575 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" Apr 20 19:18:09.464189 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.464152 2583 status_manager.go:895] "Failed to get status for pod" podUID="835210c1-7c43-47c8-b39a-79af6fd26d30" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" err="pods \"limitador-operator-controller-manager-85c4996f8c-64v2k\" is forbidden: User \"system:node:ip-10-0-139-126.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-126.ec2.internal' and this object" Apr 20 19:18:09.543251 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.543156 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4vcn\" (UniqueName: \"kubernetes.io/projected/2e8d84a0-947d-4f9c-82a5-340fe34abf01-kube-api-access-v4vcn\") pod \"limitador-operator-controller-manager-85c4996f8c-gbrtw\" (UID: \"2e8d84a0-947d-4f9c-82a5-340fe34abf01\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:09.556394 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.556364 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vcn\" (UniqueName: \"kubernetes.io/projected/2e8d84a0-947d-4f9c-82a5-340fe34abf01-kube-api-access-v4vcn\") pod \"limitador-operator-controller-manager-85c4996f8c-gbrtw\" (UID: \"2e8d84a0-947d-4f9c-82a5-340fe34abf01\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:09.603279 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.603251 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:09.644534 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.644500 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9t9b\" (UniqueName: \"kubernetes.io/projected/835210c1-7c43-47c8-b39a-79af6fd26d30-kube-api-access-g9t9b\") pod \"835210c1-7c43-47c8-b39a-79af6fd26d30\" (UID: \"835210c1-7c43-47c8-b39a-79af6fd26d30\") " Apr 20 19:18:09.646635 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.646602 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835210c1-7c43-47c8-b39a-79af6fd26d30-kube-api-access-g9t9b" (OuterVolumeSpecName: "kube-api-access-g9t9b") pod "835210c1-7c43-47c8-b39a-79af6fd26d30" (UID: "835210c1-7c43-47c8-b39a-79af6fd26d30"). InnerVolumeSpecName "kube-api-access-g9t9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:18:09.741614 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.741582 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw"] Apr 20 19:18:09.742646 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:18:09.742609 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e8d84a0_947d_4f9c_82a5_340fe34abf01.slice/crio-441777819faec0c790dfa8416f4ae8fa05320f13b90e2e9863bd687698a1bfc1 WatchSource:0}: Error finding container 441777819faec0c790dfa8416f4ae8fa05320f13b90e2e9863bd687698a1bfc1: Status 404 returned error can't find the container with id 441777819faec0c790dfa8416f4ae8fa05320f13b90e2e9863bd687698a1bfc1 Apr 20 19:18:09.745341 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:09.745300 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g9t9b\" (UniqueName: 
\"kubernetes.io/projected/835210c1-7c43-47c8-b39a-79af6fd26d30-kube-api-access-g9t9b\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\"" Apr 20 19:18:10.364444 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:10.362634 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835210c1-7c43-47c8-b39a-79af6fd26d30" path="/var/lib/kubelet/pods/835210c1-7c43-47c8-b39a-79af6fd26d30/volumes" Apr 20 19:18:10.372136 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:10.372100 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" event={"ID":"2e8d84a0-947d-4f9c-82a5-340fe34abf01","Type":"ContainerStarted","Data":"bd3ed3e6837dc93e04a120ad5d6e0eab420eb3904e57dcc595bc80436a15a863"} Apr 20 19:18:10.372322 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:10.372142 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" event={"ID":"2e8d84a0-947d-4f9c-82a5-340fe34abf01","Type":"ContainerStarted","Data":"441777819faec0c790dfa8416f4ae8fa05320f13b90e2e9863bd687698a1bfc1"} Apr 20 19:18:10.372322 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:10.372207 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:10.373490 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:10.373465 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-64v2k" Apr 20 19:18:10.373490 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:10.373481 2583 scope.go:117] "RemoveContainer" containerID="0e05f2bdeb282eb157488314a4835380b94789db8ce9bcd21823509c3890980d" Apr 20 19:18:14.270012 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:14.269986 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:18:14.270470 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:14.270042 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log" Apr 20 19:18:21.379945 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:21.379908 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" Apr 20 19:18:21.399936 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:21.399887 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-gbrtw" podStartSLOduration=12.399871381 podStartE2EDuration="12.399871381s" podCreationTimestamp="2026-04-20 19:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:18:10.390519 +0000 UTC m=+596.622809509" watchObservedRunningTime="2026-04-20 19:18:21.399871381 +0000 UTC m=+607.632161889" Apr 20 19:18:37.891580 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.891363 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"] Apr 20 19:18:37.894122 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.894090 2583 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.896921 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.896883 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-56mlx\"" Apr 20 19:18:37.911409 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.911384 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"] Apr 20 19:18:37.941214 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941184 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/772cb982-87f2-4078-9b45-47fd619a29ac-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941363 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941235 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/772cb982-87f2-4078-9b45-47fd619a29ac-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941363 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941262 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qb6j\" (UniqueName: \"kubernetes.io/projected/772cb982-87f2-4078-9b45-47fd619a29ac-kube-api-access-9qb6j\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941363 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941324 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941363 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941350 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941506 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941368 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941506 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941386 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941506 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941453 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/772cb982-87f2-4078-9b45-47fd619a29ac-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:37.941506 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:37.941482 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:38.042267 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042231 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 19:18:38.042472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042281 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/772cb982-87f2-4078-9b45-47fd619a29ac-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" Apr 20 
19:18:38.042472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042346 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/772cb982-87f2-4078-9b45-47fd619a29ac-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042371 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qb6j\" (UniqueName: \"kubernetes.io/projected/772cb982-87f2-4078-9b45-47fd619a29ac-kube-api-access-9qb6j\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042472 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042434 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042527 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042574 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042606 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042699 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/772cb982-87f2-4078-9b45-47fd619a29ac-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042900 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042797 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.042900 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.042886 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.043087 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.043067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.043183 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.043162 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.043232 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.043168 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/772cb982-87f2-4078-9b45-47fd619a29ac-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.044837 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.044816 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/772cb982-87f2-4078-9b45-47fd619a29ac-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.045177 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.045158 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/772cb982-87f2-4078-9b45-47fd619a29ac-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.055199 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.055176 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qb6j\" (UniqueName: \"kubernetes.io/projected/772cb982-87f2-4078-9b45-47fd619a29ac-kube-api-access-9qb6j\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.055423 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.055399 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/772cb982-87f2-4078-9b45-47fd619a29ac-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-f4xmt\" (UID: \"772cb982-87f2-4078-9b45-47fd619a29ac\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.205888 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.205803 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:38.337764 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.337735 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"]
Apr 20 19:18:38.338453 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:18:38.338425 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772cb982_87f2_4078_9b45_47fd619a29ac.slice/crio-0dc2b8822c76e955b2451c02874c1cb022dc09da25052bba8fa13c42c59c52bb WatchSource:0}: Error finding container 0dc2b8822c76e955b2451c02874c1cb022dc09da25052bba8fa13c42c59c52bb: Status 404 returned error can't find the container with id 0dc2b8822c76e955b2451c02874c1cb022dc09da25052bba8fa13c42c59c52bb
Apr 20 19:18:38.340707 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.340675 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 19:18:38.340811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.340756 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 19:18:38.340811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.340799 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 19:18:38.482773 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.482690 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" event={"ID":"772cb982-87f2-4078-9b45-47fd619a29ac","Type":"ContainerStarted","Data":"26b9cdcccf0e783b16876ff3260cae9972bdcde27f847c2dd7420ff6a05a735f"}
Apr 20 19:18:38.482773 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.482727 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" event={"ID":"772cb982-87f2-4078-9b45-47fd619a29ac","Type":"ContainerStarted","Data":"0dc2b8822c76e955b2451c02874c1cb022dc09da25052bba8fa13c42c59c52bb"}
Apr 20 19:18:38.506301 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:38.506249 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt" podStartSLOduration=1.5062343679999999 podStartE2EDuration="1.506234368s" podCreationTimestamp="2026-04-20 19:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:18:38.503479208 +0000 UTC m=+624.735769711" watchObservedRunningTime="2026-04-20 19:18:38.506234368 +0000 UTC m=+624.738524878"
Apr 20 19:18:39.207021 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:39.206991 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:39.212211 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:39.212188 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:39.487068 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:39.486993 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:39.488130 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:39.488112 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-f4xmt"
Apr 20 19:18:52.205287 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.205252 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l2w2r"]
Apr 20 19:18:52.208248 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.208227 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:52.210787 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.210765 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nb9rv\""
Apr 20 19:18:52.215374 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.215349 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l2w2r"]
Apr 20 19:18:52.255634 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.255604 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj57\" (UniqueName: \"kubernetes.io/projected/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e-kube-api-access-4wj57\") pod \"authorino-f99f4b5cd-l2w2r\" (UID: \"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e\") " pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:52.356765 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.356729 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj57\" (UniqueName: \"kubernetes.io/projected/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e-kube-api-access-4wj57\") pod \"authorino-f99f4b5cd-l2w2r\" (UID: \"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e\") " pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:52.365061 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.365027 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj57\" (UniqueName: \"kubernetes.io/projected/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e-kube-api-access-4wj57\") pod \"authorino-f99f4b5cd-l2w2r\" (UID: \"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e\") " pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:52.519887 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.519800 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:52.649671 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.649645 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l2w2r"]
Apr 20 19:18:52.651954 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:18:52.651918 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c3ad98b_a961_4b7d_bedf_fe72b9f1692e.slice/crio-8ca7447fb99df9fba528e57908b396e6cf8bc519f6b206011d11b867167d2009 WatchSource:0}: Error finding container 8ca7447fb99df9fba528e57908b396e6cf8bc519f6b206011d11b867167d2009: Status 404 returned error can't find the container with id 8ca7447fb99df9fba528e57908b396e6cf8bc519f6b206011d11b867167d2009
Apr 20 19:18:52.653580 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:52.653559 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:18:53.542771 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:53.542734 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r" event={"ID":"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e","Type":"ContainerStarted","Data":"8ca7447fb99df9fba528e57908b396e6cf8bc519f6b206011d11b867167d2009"}
Apr 20 19:18:56.556937 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:56.556896 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r" event={"ID":"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e","Type":"ContainerStarted","Data":"88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb"}
Apr 20 19:18:56.572983 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:56.572934 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r" podStartSLOduration=1.159712614 podStartE2EDuration="4.572919608s" podCreationTimestamp="2026-04-20 19:18:52 +0000 UTC" firstStartedPulling="2026-04-20 19:18:52.653715191 +0000 UTC m=+638.886005677" lastFinishedPulling="2026-04-20 19:18:56.066922185 +0000 UTC m=+642.299212671" observedRunningTime="2026-04-20 19:18:56.571685544 +0000 UTC m=+642.803976057" watchObservedRunningTime="2026-04-20 19:18:56.572919608 +0000 UTC m=+642.805210166"
Apr 20 19:18:57.857025 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:57.856986 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l2w2r"]
Apr 20 19:18:58.564513 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:58.564467 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r" podUID="2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" containerName="authorino" containerID="cri-o://88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb" gracePeriod=30
Apr 20 19:18:58.810832 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:58.810812 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:58.914488 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:58.914455 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wj57\" (UniqueName: \"kubernetes.io/projected/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e-kube-api-access-4wj57\") pod \"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e\" (UID: \"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e\") "
Apr 20 19:18:58.916643 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:58.916620 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e-kube-api-access-4wj57" (OuterVolumeSpecName: "kube-api-access-4wj57") pod "2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" (UID: "2c3ad98b-a961-4b7d-bedf-fe72b9f1692e"). InnerVolumeSpecName "kube-api-access-4wj57". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:18:59.015096 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.015047 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wj57\" (UniqueName: \"kubernetes.io/projected/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e-kube-api-access-4wj57\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:18:59.568906 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.568868 2583 generic.go:358] "Generic (PLEG): container finished" podID="2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" containerID="88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb" exitCode=0
Apr 20 19:18:59.569105 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.568926 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r" event={"ID":"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e","Type":"ContainerDied","Data":"88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb"}
Apr 20 19:18:59.569105 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.568935 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r"
Apr 20 19:18:59.569105 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.568960 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-l2w2r" event={"ID":"2c3ad98b-a961-4b7d-bedf-fe72b9f1692e","Type":"ContainerDied","Data":"8ca7447fb99df9fba528e57908b396e6cf8bc519f6b206011d11b867167d2009"}
Apr 20 19:18:59.569105 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.568976 2583 scope.go:117] "RemoveContainer" containerID="88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb"
Apr 20 19:18:59.578442 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.578394 2583 scope.go:117] "RemoveContainer" containerID="88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb"
Apr 20 19:18:59.578733 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:18:59.578712 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb\": container with ID starting with 88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb not found: ID does not exist" containerID="88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb"
Apr 20 19:18:59.578795 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.578743 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb"} err="failed to get container status \"88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb\": rpc error: code = NotFound desc = could not find container \"88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb\": container with ID starting with 88f7200305cea564d27c11ac103e13b00bcb2335d62f67d993a7c4b8306febbb not found: ID does not exist"
Apr 20 19:18:59.590919 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.590895 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l2w2r"]
Apr 20 19:18:59.592922 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:18:59.592904 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-l2w2r"]
Apr 20 19:19:00.362048 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:00.362021 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" path="/var/lib/kubelet/pods/2c3ad98b-a961-4b7d-bedf-fe72b9f1692e/volumes"
Apr 20 19:19:26.348999 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.348965 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wzm5k"]
Apr 20 19:19:26.349394 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.349332 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" containerName="authorino"
Apr 20 19:19:26.349394 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.349344 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" containerName="authorino"
Apr 20 19:19:26.349471 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.349398 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c3ad98b-a961-4b7d-bedf-fe72b9f1692e" containerName="authorino"
Apr 20 19:19:26.356502 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.356477 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:26.359768 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.359724 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nb9rv\""
Apr 20 19:19:26.362677 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.362656 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wzm5k"]
Apr 20 19:19:26.447595 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.447563 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ms7s\" (UniqueName: \"kubernetes.io/projected/453527cd-fc55-41ae-8048-a9ae2ae8064a-kube-api-access-2ms7s\") pod \"authorino-8b475cf9f-wzm5k\" (UID: \"453527cd-fc55-41ae-8048-a9ae2ae8064a\") " pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:26.548664 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.548630 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ms7s\" (UniqueName: \"kubernetes.io/projected/453527cd-fc55-41ae-8048-a9ae2ae8064a-kube-api-access-2ms7s\") pod \"authorino-8b475cf9f-wzm5k\" (UID: \"453527cd-fc55-41ae-8048-a9ae2ae8064a\") " pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:26.558933 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.558905 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ms7s\" (UniqueName: \"kubernetes.io/projected/453527cd-fc55-41ae-8048-a9ae2ae8064a-kube-api-access-2ms7s\") pod \"authorino-8b475cf9f-wzm5k\" (UID: \"453527cd-fc55-41ae-8048-a9ae2ae8064a\") " pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:26.587550 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.587518 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wzm5k"]
Apr 20 19:19:26.587793 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.587777 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:26.617612 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.617579 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7d67bc7974-k5jpg"]
Apr 20 19:19:26.623637 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.623247 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d67bc7974-k5jpg"
Apr 20 19:19:26.626795 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.626771 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d67bc7974-k5jpg"]
Apr 20 19:19:26.687444 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.687416 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d67bc7974-k5jpg"]
Apr 20 19:19:26.687678 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:19:26.687658 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cdx7q], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7d67bc7974-k5jpg" podUID="efaf78fc-2ad9-4303-bd54-1bd2c8546841"
Apr 20 19:19:26.713298 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.713264 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-748f7b8d66-bdw2z"]
Apr 20 19:19:26.717004 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.716989 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:26.719669 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.719649 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 19:19:26.726659 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.726606 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-748f7b8d66-bdw2z"]
Apr 20 19:19:26.729296 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:19:26.729275 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod453527cd_fc55_41ae_8048_a9ae2ae8064a.slice/crio-a9abe4fca0fb9cbf8079486f13e271f15b9213e728c8a4dda6f43189c94fb735 WatchSource:0}: Error finding container a9abe4fca0fb9cbf8079486f13e271f15b9213e728c8a4dda6f43189c94fb735: Status 404 returned error can't find the container with id a9abe4fca0fb9cbf8079486f13e271f15b9213e728c8a4dda6f43189c94fb735
Apr 20 19:19:26.730237 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.730219 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wzm5k"]
Apr 20 19:19:26.751224 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.751192 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdx7q\" (UniqueName: \"kubernetes.io/projected/efaf78fc-2ad9-4303-bd54-1bd2c8546841-kube-api-access-cdx7q\") pod \"authorino-7d67bc7974-k5jpg\" (UID: \"efaf78fc-2ad9-4303-bd54-1bd2c8546841\") " pod="kuadrant-system/authorino-7d67bc7974-k5jpg"
Apr 20 19:19:26.852604 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.852526 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdx7q\" (UniqueName: \"kubernetes.io/projected/efaf78fc-2ad9-4303-bd54-1bd2c8546841-kube-api-access-cdx7q\") pod \"authorino-7d67bc7974-k5jpg\" (UID: \"efaf78fc-2ad9-4303-bd54-1bd2c8546841\") " pod="kuadrant-system/authorino-7d67bc7974-k5jpg"
Apr 20 19:19:26.852604 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.852568 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhxlw\" (UniqueName: \"kubernetes.io/projected/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-kube-api-access-nhxlw\") pod \"authorino-748f7b8d66-bdw2z\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") " pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:26.852604 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.852601 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-tls-cert\") pod \"authorino-748f7b8d66-bdw2z\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") " pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:26.861241 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.861205 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdx7q\" (UniqueName: \"kubernetes.io/projected/efaf78fc-2ad9-4303-bd54-1bd2c8546841-kube-api-access-cdx7q\") pod \"authorino-7d67bc7974-k5jpg\" (UID: \"efaf78fc-2ad9-4303-bd54-1bd2c8546841\") " pod="kuadrant-system/authorino-7d67bc7974-k5jpg"
Apr 20 19:19:26.953142 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.953102 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhxlw\" (UniqueName: \"kubernetes.io/projected/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-kube-api-access-nhxlw\") pod \"authorino-748f7b8d66-bdw2z\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") " pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:26.953142 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.953145 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-tls-cert\") pod \"authorino-748f7b8d66-bdw2z\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") " pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:26.955780 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.955751 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-tls-cert\") pod \"authorino-748f7b8d66-bdw2z\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") " pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:26.961760 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:26.961734 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhxlw\" (UniqueName: \"kubernetes.io/projected/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-kube-api-access-nhxlw\") pod \"authorino-748f7b8d66-bdw2z\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") " pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:27.026715 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.026684 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:19:27.155135 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.155105 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-748f7b8d66-bdw2z"]
Apr 20 19:19:27.156902 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:19:27.156875 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod600058fc_4f3f_48a4_a74b_3f2ae38bedcd.slice/crio-31eef648b17e7d067cb5d25e593720977d443bf78bc28d44b68eb9a9a5b3b8b6 WatchSource:0}: Error finding container 31eef648b17e7d067cb5d25e593720977d443bf78bc28d44b68eb9a9a5b3b8b6: Status 404 returned error can't find the container with id 31eef648b17e7d067cb5d25e593720977d443bf78bc28d44b68eb9a9a5b3b8b6
Apr 20 19:19:27.686125 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.686092 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-748f7b8d66-bdw2z" event={"ID":"600058fc-4f3f-48a4-a74b-3f2ae38bedcd","Type":"ContainerStarted","Data":"aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e"}
Apr 20 19:19:27.686125 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.686127 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-748f7b8d66-bdw2z" event={"ID":"600058fc-4f3f-48a4-a74b-3f2ae38bedcd","Type":"ContainerStarted","Data":"31eef648b17e7d067cb5d25e593720977d443bf78bc28d44b68eb9a9a5b3b8b6"}
Apr 20 19:19:27.687537 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.687514 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wzm5k" event={"ID":"453527cd-fc55-41ae-8048-a9ae2ae8064a","Type":"ContainerStarted","Data":"e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1"}
Apr 20 19:19:27.687667 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.687550 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wzm5k" event={"ID":"453527cd-fc55-41ae-8048-a9ae2ae8064a","Type":"ContainerStarted","Data":"a9abe4fca0fb9cbf8079486f13e271f15b9213e728c8a4dda6f43189c94fb735"}
Apr 20 19:19:27.687667 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.687553 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-wzm5k" podUID="453527cd-fc55-41ae-8048-a9ae2ae8064a" containerName="authorino" containerID="cri-o://e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1" gracePeriod=30
Apr 20 19:19:27.687667 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.687584 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d67bc7974-k5jpg"
Apr 20 19:19:27.693222 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.693203 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d67bc7974-k5jpg"
Apr 20 19:19:27.704677 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.704636 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-748f7b8d66-bdw2z" podStartSLOduration=1.3806678589999999 podStartE2EDuration="1.704623621s" podCreationTimestamp="2026-04-20 19:19:26 +0000 UTC" firstStartedPulling="2026-04-20 19:19:27.158489271 +0000 UTC m=+673.390779760" lastFinishedPulling="2026-04-20 19:19:27.482445024 +0000 UTC m=+673.714735522" observedRunningTime="2026-04-20 19:19:27.703385519 +0000 UTC m=+673.935676021" watchObservedRunningTime="2026-04-20 19:19:27.704623621 +0000 UTC m=+673.936914136"
Apr 20 19:19:27.721885 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.721841 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-wzm5k" podStartSLOduration=1.397431948 podStartE2EDuration="1.721826597s" podCreationTimestamp="2026-04-20 19:19:26 +0000 UTC" firstStartedPulling="2026-04-20 19:19:26.731073777 +0000 UTC m=+672.963364262" lastFinishedPulling="2026-04-20 19:19:27.055468422 +0000 UTC m=+673.287758911" observedRunningTime="2026-04-20 19:19:27.720725529 +0000 UTC m=+673.953016050" watchObservedRunningTime="2026-04-20 19:19:27.721826597 +0000 UTC m=+673.954117104"
Apr 20 19:19:27.861793 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.861748 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdx7q\" (UniqueName: \"kubernetes.io/projected/efaf78fc-2ad9-4303-bd54-1bd2c8546841-kube-api-access-cdx7q\") pod \"efaf78fc-2ad9-4303-bd54-1bd2c8546841\" (UID: \"efaf78fc-2ad9-4303-bd54-1bd2c8546841\") "
Apr 20 19:19:27.866070 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.866040 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efaf78fc-2ad9-4303-bd54-1bd2c8546841-kube-api-access-cdx7q" (OuterVolumeSpecName: "kube-api-access-cdx7q") pod "efaf78fc-2ad9-4303-bd54-1bd2c8546841" (UID: "efaf78fc-2ad9-4303-bd54-1bd2c8546841"). InnerVolumeSpecName "kube-api-access-cdx7q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:19:27.934038 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.934017 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:27.963109 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:27.963078 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdx7q\" (UniqueName: \"kubernetes.io/projected/efaf78fc-2ad9-4303-bd54-1bd2c8546841-kube-api-access-cdx7q\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:19:28.064172 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.064086 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ms7s\" (UniqueName: \"kubernetes.io/projected/453527cd-fc55-41ae-8048-a9ae2ae8064a-kube-api-access-2ms7s\") pod \"453527cd-fc55-41ae-8048-a9ae2ae8064a\" (UID: \"453527cd-fc55-41ae-8048-a9ae2ae8064a\") "
Apr 20 19:19:28.066349 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.066323 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453527cd-fc55-41ae-8048-a9ae2ae8064a-kube-api-access-2ms7s" (OuterVolumeSpecName: "kube-api-access-2ms7s") pod "453527cd-fc55-41ae-8048-a9ae2ae8064a" (UID: "453527cd-fc55-41ae-8048-a9ae2ae8064a"). InnerVolumeSpecName "kube-api-access-2ms7s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:19:28.164946 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.164905 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2ms7s\" (UniqueName: \"kubernetes.io/projected/453527cd-fc55-41ae-8048-a9ae2ae8064a-kube-api-access-2ms7s\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:19:28.692532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.692437 2583 generic.go:358] "Generic (PLEG): container finished" podID="453527cd-fc55-41ae-8048-a9ae2ae8064a" containerID="e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1" exitCode=0
Apr 20 19:19:28.692532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.692486 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-wzm5k"
Apr 20 19:19:28.692532 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.692519 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wzm5k" event={"ID":"453527cd-fc55-41ae-8048-a9ae2ae8064a","Type":"ContainerDied","Data":"e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1"}
Apr 20 19:19:28.693078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.692554 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-wzm5k" event={"ID":"453527cd-fc55-41ae-8048-a9ae2ae8064a","Type":"ContainerDied","Data":"a9abe4fca0fb9cbf8079486f13e271f15b9213e728c8a4dda6f43189c94fb735"}
Apr 20 19:19:28.693078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.692575 2583 scope.go:117] "RemoveContainer" containerID="e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1"
Apr 20 19:19:28.693078 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.692747 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-7d67bc7974-k5jpg" Apr 20 19:19:28.701500 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.701482 2583 scope.go:117] "RemoveContainer" containerID="e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1" Apr 20 19:19:28.704244 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:19:28.704200 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1\": container with ID starting with e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1 not found: ID does not exist" containerID="e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1" Apr 20 19:19:28.704378 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.704235 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1"} err="failed to get container status \"e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1\": rpc error: code = NotFound desc = could not find container \"e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1\": container with ID starting with e78f9a520c5772fd7816a61b5e332e2249864212d6b8424f246a82e512e96ab1 not found: ID does not exist" Apr 20 19:19:28.712567 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.712544 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wzm5k"] Apr 20 19:19:28.722934 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.722905 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-wzm5k"] Apr 20 19:19:28.750501 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:28.750472 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d67bc7974-k5jpg"] Apr 20 19:19:28.757266 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:19:28.757236 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7d67bc7974-k5jpg"] Apr 20 19:19:30.366741 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:30.366706 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453527cd-fc55-41ae-8048-a9ae2ae8064a" path="/var/lib/kubelet/pods/453527cd-fc55-41ae-8048-a9ae2ae8064a/volumes" Apr 20 19:19:30.367113 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:19:30.367018 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efaf78fc-2ad9-4303-bd54-1bd2c8546841" path="/var/lib/kubelet/pods/efaf78fc-2ad9-4303-bd54-1bd2c8546841/volumes" Apr 20 19:20:29.977820 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.977783 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm"] Apr 20 19:20:29.978296 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.978176 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="453527cd-fc55-41ae-8048-a9ae2ae8064a" containerName="authorino" Apr 20 19:20:29.978296 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.978189 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="453527cd-fc55-41ae-8048-a9ae2ae8064a" containerName="authorino" Apr 20 19:20:29.978296 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.978259 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="453527cd-fc55-41ae-8048-a9ae2ae8064a" containerName="authorino" Apr 20 19:20:29.982696 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.982675 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:29.985979 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.985954 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 19:20:29.987511 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.987490 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 19:20:29.987608 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.987526 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 19:20:29.987608 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.987596 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-lf8z7\"" Apr 20 19:20:29.995097 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:29.995075 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm"] Apr 20 19:20:30.011474 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.011443 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.011603 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.011487 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95a9da12-a31b-455c-b073-3506ba305a8f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" 
Apr 20 19:20:30.011603 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.011551 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.011603 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.011580 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs928\" (UniqueName: \"kubernetes.io/projected/95a9da12-a31b-455c-b073-3506ba305a8f-kube-api-access-hs928\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.011755 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.011634 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.011755 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.011731 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112436 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112381 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112463 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112481 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95a9da12-a31b-455c-b073-3506ba305a8f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112498 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs928\" (UniqueName: \"kubernetes.io/projected/95a9da12-a31b-455c-b073-3506ba305a8f-kube-api-access-hs928\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: 
\"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112654 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112567 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112928 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112883 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112982 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112891 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.112982 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.112959 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.114909 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.114878 2583 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/95a9da12-a31b-455c-b073-3506ba305a8f-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.115331 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.115292 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/95a9da12-a31b-455c-b073-3506ba305a8f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.123783 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.123755 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs928\" (UniqueName: \"kubernetes.io/projected/95a9da12-a31b-455c-b073-3506ba305a8f-kube-api-access-hs928\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm\" (UID: \"95a9da12-a31b-455c-b073-3506ba305a8f\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.294447 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.294358 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:30.425901 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.425873 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm"] Apr 20 19:20:30.427350 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:20:30.427322 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a9da12_a31b_455c_b073_3506ba305a8f.slice/crio-255d05fcb055aeb8753d9947ce34eaf880ab1ff91e27804438ac207798b215af WatchSource:0}: Error finding container 255d05fcb055aeb8753d9947ce34eaf880ab1ff91e27804438ac207798b215af: Status 404 returned error can't find the container with id 255d05fcb055aeb8753d9947ce34eaf880ab1ff91e27804438ac207798b215af Apr 20 19:20:30.948395 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:30.948348 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" event={"ID":"95a9da12-a31b-455c-b073-3506ba305a8f","Type":"ContainerStarted","Data":"255d05fcb055aeb8753d9947ce34eaf880ab1ff91e27804438ac207798b215af"} Apr 20 19:20:35.971676 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:35.971643 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" event={"ID":"95a9da12-a31b-455c-b073-3506ba305a8f","Type":"ContainerStarted","Data":"22c531cce031ffdb29f1be03bbdab1c6eaf4b0257930581dbcbfe5062e16647e"} Apr 20 19:20:41.001206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:41.001174 2583 generic.go:358] "Generic (PLEG): container finished" podID="95a9da12-a31b-455c-b073-3506ba305a8f" containerID="22c531cce031ffdb29f1be03bbdab1c6eaf4b0257930581dbcbfe5062e16647e" exitCode=0 Apr 20 19:20:41.001660 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:41.001251 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" event={"ID":"95a9da12-a31b-455c-b073-3506ba305a8f","Type":"ContainerDied","Data":"22c531cce031ffdb29f1be03bbdab1c6eaf4b0257930581dbcbfe5062e16647e"} Apr 20 19:20:43.011276 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:43.011240 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" event={"ID":"95a9da12-a31b-455c-b073-3506ba305a8f","Type":"ContainerStarted","Data":"54355bb38e3d134b8d07e6a67dc6fbc42a3d0ba4c9b8202e79edb50f805ebbb8"} Apr 20 19:20:43.011676 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:43.011507 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:20:43.032593 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:43.032551 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" podStartSLOduration=2.370012971 podStartE2EDuration="14.03253738s" podCreationTimestamp="2026-04-20 19:20:29 +0000 UTC" firstStartedPulling="2026-04-20 19:20:30.429686605 +0000 UTC m=+736.661977091" lastFinishedPulling="2026-04-20 19:20:42.092211004 +0000 UTC m=+748.324501500" observedRunningTime="2026-04-20 19:20:43.029773058 +0000 UTC m=+749.262063564" watchObservedRunningTime="2026-04-20 19:20:43.03253738 +0000 UTC m=+749.264827888" Apr 20 19:20:54.028679 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:20:54.028649 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm" Apr 20 19:21:10.666763 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.666724 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q"] Apr 20 19:21:10.669583 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.669563 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.672753 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.672724 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 19:21:10.680852 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.680828 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q"] Apr 20 19:21:10.691063 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.691030 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.691176 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.691137 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.691232 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.691186 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.691232 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.691225 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd8c972-6266-4aa2-893c-8db4ba13e971-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.691341 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.691302 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.691392 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.691376 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstkd\" (UniqueName: \"kubernetes.io/projected/7bd8c972-6266-4aa2-893c-8db4ba13e971-kube-api-access-jstkd\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792424 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792392 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792628 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792430 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: 
\"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792628 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792504 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd8c972-6266-4aa2-893c-8db4ba13e971-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792628 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792576 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792802 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792627 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jstkd\" (UniqueName: \"kubernetes.io/projected/7bd8c972-6266-4aa2-893c-8db4ba13e971-kube-api-access-jstkd\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792802 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792667 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792906 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792849 2583 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.792906 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.792893 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.793018 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.793001 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.794951 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.794928 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7bd8c972-6266-4aa2-893c-8db4ba13e971-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.795066 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.795021 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bd8c972-6266-4aa2-893c-8db4ba13e971-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 
19:21:10.802693 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.802671 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstkd\" (UniqueName: \"kubernetes.io/projected/7bd8c972-6266-4aa2-893c-8db4ba13e971-kube-api-access-jstkd\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-f6h6q\" (UID: \"7bd8c972-6266-4aa2-893c-8db4ba13e971\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:10.980456 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:10.980369 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:11.131320 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:11.131269 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q"] Apr 20 19:21:11.131688 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:21:11.131659 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bd8c972_6266_4aa2_893c_8db4ba13e971.slice/crio-74a76291d378cc808432029b9bd8cded99dcb28df4126a93fa5d7ca0b384e482 WatchSource:0}: Error finding container 74a76291d378cc808432029b9bd8cded99dcb28df4126a93fa5d7ca0b384e482: Status 404 returned error can't find the container with id 74a76291d378cc808432029b9bd8cded99dcb28df4126a93fa5d7ca0b384e482 Apr 20 19:21:12.126215 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:12.126178 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" event={"ID":"7bd8c972-6266-4aa2-893c-8db4ba13e971","Type":"ContainerStarted","Data":"9b4cce7aa7e5258b90b4fdb7c51923acc6c19b279c4fe2c8e9d779947049a932"} Apr 20 19:21:12.126215 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:12.126216 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" 
event={"ID":"7bd8c972-6266-4aa2-893c-8db4ba13e971","Type":"ContainerStarted","Data":"74a76291d378cc808432029b9bd8cded99dcb28df4126a93fa5d7ca0b384e482"} Apr 20 19:21:17.147738 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:17.147700 2583 generic.go:358] "Generic (PLEG): container finished" podID="7bd8c972-6266-4aa2-893c-8db4ba13e971" containerID="9b4cce7aa7e5258b90b4fdb7c51923acc6c19b279c4fe2c8e9d779947049a932" exitCode=0 Apr 20 19:21:17.148162 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:17.147778 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" event={"ID":"7bd8c972-6266-4aa2-893c-8db4ba13e971","Type":"ContainerDied","Data":"9b4cce7aa7e5258b90b4fdb7c51923acc6c19b279c4fe2c8e9d779947049a932"} Apr 20 19:21:18.153073 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:18.153038 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" event={"ID":"7bd8c972-6266-4aa2-893c-8db4ba13e971","Type":"ContainerStarted","Data":"f613b7c03ab42194f4b05eb327d2f0202d9405230ee89506cb6e50405b45f48b"} Apr 20 19:21:18.153467 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:18.153265 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" Apr 20 19:21:18.171218 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:18.171175 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q" podStartSLOduration=7.9581411939999995 podStartE2EDuration="8.171162785s" podCreationTimestamp="2026-04-20 19:21:10 +0000 UTC" firstStartedPulling="2026-04-20 19:21:17.148466476 +0000 UTC m=+783.380756962" lastFinishedPulling="2026-04-20 19:21:17.361488066 +0000 UTC m=+783.593778553" observedRunningTime="2026-04-20 19:21:18.170712495 +0000 UTC m=+784.403003017" watchObservedRunningTime="2026-04-20 19:21:18.171162785 +0000 UTC m=+784.403453355" 
Apr 20 19:21:29.177702 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:29.177674 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-f6h6q"
Apr 20 19:21:40.422863 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.422780 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-748f7b8d66-bdw2z"]
Apr 20 19:21:40.423283 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.422997 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-748f7b8d66-bdw2z" podUID="600058fc-4f3f-48a4-a74b-3f2ae38bedcd" containerName="authorino" containerID="cri-o://aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e" gracePeriod=30
Apr 20 19:21:40.674002 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.673948 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:21:40.765530 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.765496 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-tls-cert\") pod \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") "
Apr 20 19:21:40.765695 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.765593 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhxlw\" (UniqueName: \"kubernetes.io/projected/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-kube-api-access-nhxlw\") pod \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\" (UID: \"600058fc-4f3f-48a4-a74b-3f2ae38bedcd\") "
Apr 20 19:21:40.767730 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.767702 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-kube-api-access-nhxlw" (OuterVolumeSpecName: "kube-api-access-nhxlw") pod "600058fc-4f3f-48a4-a74b-3f2ae38bedcd" (UID: "600058fc-4f3f-48a4-a74b-3f2ae38bedcd"). InnerVolumeSpecName "kube-api-access-nhxlw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:21:40.775965 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.775940 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "600058fc-4f3f-48a4-a74b-3f2ae38bedcd" (UID: "600058fc-4f3f-48a4-a74b-3f2ae38bedcd"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:21:40.867067 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.867035 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhxlw\" (UniqueName: \"kubernetes.io/projected/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-kube-api-access-nhxlw\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:21:40.867067 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:40.867062 2583 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/600058fc-4f3f-48a4-a74b-3f2ae38bedcd-tls-cert\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:21:41.251608 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.251575 2583 generic.go:358] "Generic (PLEG): container finished" podID="600058fc-4f3f-48a4-a74b-3f2ae38bedcd" containerID="aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e" exitCode=0
Apr 20 19:21:41.251805 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.251628 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-748f7b8d66-bdw2z"
Apr 20 19:21:41.251805 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.251656 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-748f7b8d66-bdw2z" event={"ID":"600058fc-4f3f-48a4-a74b-3f2ae38bedcd","Type":"ContainerDied","Data":"aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e"}
Apr 20 19:21:41.251805 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.251691 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-748f7b8d66-bdw2z" event={"ID":"600058fc-4f3f-48a4-a74b-3f2ae38bedcd","Type":"ContainerDied","Data":"31eef648b17e7d067cb5d25e593720977d443bf78bc28d44b68eb9a9a5b3b8b6"}
Apr 20 19:21:41.251805 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.251706 2583 scope.go:117] "RemoveContainer" containerID="aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e"
Apr 20 19:21:41.261163 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.261144 2583 scope.go:117] "RemoveContainer" containerID="aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e"
Apr 20 19:21:41.261455 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:21:41.261437 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e\": container with ID starting with aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e not found: ID does not exist" containerID="aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e"
Apr 20 19:21:41.261519 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.261464 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e"} err="failed to get container status \"aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e\": rpc error: code = NotFound desc = could not find container \"aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e\": container with ID starting with aa32c6abc4da2c15e81f684bbe0819248ed04a0623c7a9b1d4b751ededc31e8e not found: ID does not exist"
Apr 20 19:21:41.273137 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.273110 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-748f7b8d66-bdw2z"]
Apr 20 19:21:41.278319 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:41.278291 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-748f7b8d66-bdw2z"]
Apr 20 19:21:42.362058 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:21:42.362010 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="600058fc-4f3f-48a4-a74b-3f2ae38bedcd" path="/var/lib/kubelet/pods/600058fc-4f3f-48a4-a74b-3f2ae38bedcd/volumes"
Apr 20 19:23:14.302113 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:23:14.302029 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:23:14.302619 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:23:14.302476 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:28:14.332271 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:28:14.332241 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:28:14.334734 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:28:14.334203 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:30:00.143787 ip-10-0-139-126
kubenswrapper[2583]: I0420 19:30:00.143751 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611890-xsnww"]
Apr 20 19:30:00.144273 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.144109 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="600058fc-4f3f-48a4-a74b-3f2ae38bedcd" containerName="authorino"
Apr 20 19:30:00.144273 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.144119 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="600058fc-4f3f-48a4-a74b-3f2ae38bedcd" containerName="authorino"
Apr 20 19:30:00.144273 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.144188 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="600058fc-4f3f-48a4-a74b-3f2ae38bedcd" containerName="authorino"
Apr 20 19:30:00.147158 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.147142 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:30:00.149925 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.149905 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qbkhb\""
Apr 20 19:30:00.163258 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.163230 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611890-xsnww"]
Apr 20 19:30:00.223865 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.223833 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mf9\" (UniqueName: \"kubernetes.io/projected/63d994a8-a044-407e-b8c5-1ae8657419bc-kube-api-access-59mf9\") pod \"maas-api-key-cleanup-29611890-xsnww\" (UID: \"63d994a8-a044-407e-b8c5-1ae8657419bc\") " pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:30:00.324731 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.324700 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59mf9\" (UniqueName: \"kubernetes.io/projected/63d994a8-a044-407e-b8c5-1ae8657419bc-kube-api-access-59mf9\") pod \"maas-api-key-cleanup-29611890-xsnww\" (UID: \"63d994a8-a044-407e-b8c5-1ae8657419bc\") " pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:30:00.333942 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.333905 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mf9\" (UniqueName: \"kubernetes.io/projected/63d994a8-a044-407e-b8c5-1ae8657419bc-kube-api-access-59mf9\") pod \"maas-api-key-cleanup-29611890-xsnww\" (UID: \"63d994a8-a044-407e-b8c5-1ae8657419bc\") " pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:30:00.457718 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.457623 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:30:00.791636 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.791605 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611890-xsnww"]
Apr 20 19:30:00.792698 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:30:00.792672 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63d994a8_a044_407e_b8c5_1ae8657419bc.slice/crio-34a22d5637f7a74e47c2b7bdca1ccfd007e935c1e13e85f3fc64a774bf55badf WatchSource:0}: Error finding container 34a22d5637f7a74e47c2b7bdca1ccfd007e935c1e13e85f3fc64a774bf55badf: Status 404 returned error can't find the container with id 34a22d5637f7a74e47c2b7bdca1ccfd007e935c1e13e85f3fc64a774bf55badf
Apr 20 19:30:00.794430 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:00.794411 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:30:01.219952 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:01.219913 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" event={"ID":"63d994a8-a044-407e-b8c5-1ae8657419bc","Type":"ContainerStarted","Data":"34a22d5637f7a74e47c2b7bdca1ccfd007e935c1e13e85f3fc64a774bf55badf"}
Apr 20 19:30:04.233882 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:04.233845 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" event={"ID":"63d994a8-a044-407e-b8c5-1ae8657419bc","Type":"ContainerStarted","Data":"b0ff789140a0c10060cc12884c3d9dec0837fb9b3a2727547f6ad1bf2993356a"}
Apr 20 19:30:04.249336 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:04.249267 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" podStartSLOduration=2.046278978 podStartE2EDuration="4.249254273s" podCreationTimestamp="2026-04-20 19:30:00 +0000 UTC" firstStartedPulling="2026-04-20 19:30:00.79455121 +0000 UTC m=+1307.026841696" lastFinishedPulling="2026-04-20 19:30:02.997526496 +0000 UTC m=+1309.229816991" observedRunningTime="2026-04-20 19:30:04.248153279 +0000 UTC m=+1310.480443783" watchObservedRunningTime="2026-04-20 19:30:04.249254273 +0000 UTC m=+1310.481544781"
Apr 20 19:30:24.316984 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:24.316946 2583 generic.go:358] "Generic (PLEG): container finished" podID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerID="b0ff789140a0c10060cc12884c3d9dec0837fb9b3a2727547f6ad1bf2993356a" exitCode=6
Apr 20 19:30:24.317441 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:24.317013 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" event={"ID":"63d994a8-a044-407e-b8c5-1ae8657419bc","Type":"ContainerDied","Data":"b0ff789140a0c10060cc12884c3d9dec0837fb9b3a2727547f6ad1bf2993356a"}
Apr 20 19:30:24.317441 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:24.317374 2583 scope.go:117] "RemoveContainer" containerID="b0ff789140a0c10060cc12884c3d9dec0837fb9b3a2727547f6ad1bf2993356a"
Apr 20 19:30:25.322714 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:25.322679 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" event={"ID":"63d994a8-a044-407e-b8c5-1ae8657419bc","Type":"ContainerStarted","Data":"ee059cb6c3a49f60c052eb2b657245f6e483cfd41005010f70acd6503e9ca15b"}
Apr 20 19:30:45.402213 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:45.402180 2583 generic.go:358] "Generic (PLEG): container finished" podID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerID="ee059cb6c3a49f60c052eb2b657245f6e483cfd41005010f70acd6503e9ca15b" exitCode=6
Apr 20 19:30:45.402768 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:45.402253 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" event={"ID":"63d994a8-a044-407e-b8c5-1ae8657419bc","Type":"ContainerDied","Data":"ee059cb6c3a49f60c052eb2b657245f6e483cfd41005010f70acd6503e9ca15b"}
Apr 20 19:30:45.402768 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:45.402300 2583 scope.go:117] "RemoveContainer" containerID="b0ff789140a0c10060cc12884c3d9dec0837fb9b3a2727547f6ad1bf2993356a"
Apr 20 19:30:45.402768 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:30:45.402658 2583 scope.go:117] "RemoveContainer" containerID="ee059cb6c3a49f60c052eb2b657245f6e483cfd41005010f70acd6503e9ca15b"
Apr 20 19:30:45.402937 ip-10-0-139-126 kubenswrapper[2583]: E0420 19:30:45.402907 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611890-xsnww_opendatahub(63d994a8-a044-407e-b8c5-1ae8657419bc)\"" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc"
Apr 20 19:31:00.011811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.011764 2583 kubelet.go:2553]
"SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611890-xsnww"]
Apr 20 19:31:00.143000 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.142974 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:31:00.259355 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.259296 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mf9\" (UniqueName: \"kubernetes.io/projected/63d994a8-a044-407e-b8c5-1ae8657419bc-kube-api-access-59mf9\") pod \"63d994a8-a044-407e-b8c5-1ae8657419bc\" (UID: \"63d994a8-a044-407e-b8c5-1ae8657419bc\") "
Apr 20 19:31:00.261666 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.261643 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d994a8-a044-407e-b8c5-1ae8657419bc-kube-api-access-59mf9" (OuterVolumeSpecName: "kube-api-access-59mf9") pod "63d994a8-a044-407e-b8c5-1ae8657419bc" (UID: "63d994a8-a044-407e-b8c5-1ae8657419bc"). InnerVolumeSpecName "kube-api-access-59mf9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:31:00.360373 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.360329 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-59mf9\" (UniqueName: \"kubernetes.io/projected/63d994a8-a044-407e-b8c5-1ae8657419bc-kube-api-access-59mf9\") on node \"ip-10-0-139-126.ec2.internal\" DevicePath \"\""
Apr 20 19:31:00.463931 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.463899 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww"
Apr 20 19:31:00.464103 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.463928 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611890-xsnww" event={"ID":"63d994a8-a044-407e-b8c5-1ae8657419bc","Type":"ContainerDied","Data":"34a22d5637f7a74e47c2b7bdca1ccfd007e935c1e13e85f3fc64a774bf55badf"}
Apr 20 19:31:00.464103 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.463977 2583 scope.go:117] "RemoveContainer" containerID="ee059cb6c3a49f60c052eb2b657245f6e483cfd41005010f70acd6503e9ca15b"
Apr 20 19:31:00.487360 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.487279 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611890-xsnww"]
Apr 20 19:31:00.488552 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:00.488527 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611890-xsnww"]
Apr 20 19:31:02.361440 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:31:02.361407 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" path="/var/lib/kubelet/pods/63d994a8-a044-407e-b8c5-1ae8657419bc/volumes"
Apr 20 19:33:14.371990 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:33:14.371885 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:33:14.376052 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:33:14.374109 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:38:14.406319 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:38:14.406189 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:38:14.413201 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:38:14.409844 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:43:14.436776 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:43:14.436670 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:43:14.442278 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:43:14.442258 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:44:11.074148 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:11.074066 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c77764cd6-lcfgt_ed10f0f0-205a-48fe-9a6e-13bd52000a30/manager/0.log"
Apr 20 19:44:12.064904 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.064872 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s_6d1294cc-e8dd-4819-ab88-7bf81cc2695c/pull/0.log"
Apr 20 19:44:12.071517 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.071493 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s_6d1294cc-e8dd-4819-ab88-7bf81cc2695c/extract/0.log"
Apr 20 19:44:12.077748 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.077727 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s_6d1294cc-e8dd-4819-ab88-7bf81cc2695c/util/0.log"
Apr 20 19:44:12.189263 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.189236 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz_4502f57e-2135-44c0-ae6b-b58286052cb1/util/0.log"
Apr 20 19:44:12.195432 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.195409 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz_4502f57e-2135-44c0-ae6b-b58286052cb1/pull/0.log"
Apr 20 19:44:12.201348 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.201330 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz_4502f57e-2135-44c0-ae6b-b58286052cb1/extract/0.log"
Apr 20 19:44:12.308918 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.308891 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr_dc114f12-adc1-421a-96a0-569fc1da86aa/pull/0.log"
Apr 20 19:44:12.314531 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.314508 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr_dc114f12-adc1-421a-96a0-569fc1da86aa/extract/0.log"
Apr 20 19:44:12.319700 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.319681 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr_dc114f12-adc1-421a-96a0-569fc1da86aa/util/0.log"
Apr 20 19:44:12.428175 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.428147 2583 log.go:25] "Finished parsing log file"
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7_116a59bd-7951-46e1-9b32-6b87612bb943/util/0.log"
Apr 20 19:44:12.434246 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.434225 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7_116a59bd-7951-46e1-9b32-6b87612bb943/pull/0.log"
Apr 20 19:44:12.439592 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.439574 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7_116a59bd-7951-46e1-9b32-6b87612bb943/extract/0.log"
Apr 20 19:44:12.668613 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.668542 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-rtcdw_9c2ffd5e-2a96-4e32-b4df-6b105a8a5339/manager/0.log"
Apr 20 19:44:12.780111 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:12.780080 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9kfbb_ce6fd9d3-3d80-479c-8d81-378eb8d656e9/manager/0.log"
Apr 20 19:44:13.013206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:13.013134 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nbm7p_b74fbe7d-f0b6-49ce-a4e7-32a21289dab5/registry-server/0.log"
Apr 20 19:44:13.365843 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:13.365812 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-gbrtw_2e8d84a0-947d-4f9c-82a5-340fe34abf01/manager/0.log"
Apr 20 19:44:13.714369 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:13.714278 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f8jvsv_16786c15-23e8-40e4-aacf-5e901d7ae46e/istio-proxy/0.log"
Apr 20 19:44:14.170350 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:14.170301 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-f4xmt_772cb982-87f2-4078-9b45-47fd619a29ac/istio-proxy/0.log"
Apr 20 19:44:14.284079 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:14.284052 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7fb965fb56-9fs7v_aa77ff70-05c0-428a-9836-2f6d05ddecd7/router/0.log"
Apr 20 19:44:14.844160 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:14.844128 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-f6h6q_7bd8c972-6266-4aa2-893c-8db4ba13e971/storage-initializer/0.log"
Apr 20 19:44:14.850340 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:14.850299 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-f6h6q_7bd8c972-6266-4aa2-893c-8db4ba13e971/main/0.log"
Apr 20 19:44:15.080178 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:15.080145 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm_95a9da12-a31b-455c-b073-3506ba305a8f/main/0.log"
Apr 20 19:44:15.085887 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:15.085865 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-84qqm_95a9da12-a31b-455c-b073-3506ba305a8f/storage-initializer/0.log"
Apr 20 19:44:22.479960 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:22.479927 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rrw7f_d447757f-829f-4f8a-8032-c70450b9f31f/global-pull-secret-syncer/0.log"
Apr 20 19:44:22.660520 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:22.660493 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wnxqf_846d14a0-81fa-4701-9219-6c5631b28c34/konnectivity-agent/0.log"
Apr 20 19:44:22.718163 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:22.718139 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-126.ec2.internal_ec9ad27b137077b1d073e8b7bc68876b/haproxy/0.log"
Apr 20 19:44:25.871049 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.871015 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s_6d1294cc-e8dd-4819-ab88-7bf81cc2695c/extract/0.log"
Apr 20 19:44:25.894936 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.894907 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s_6d1294cc-e8dd-4819-ab88-7bf81cc2695c/util/0.log"
Apr 20 19:44:25.911254 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.911230 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759sp29s_6d1294cc-e8dd-4819-ab88-7bf81cc2695c/pull/0.log"
Apr 20 19:44:25.939932 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.939901 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz_4502f57e-2135-44c0-ae6b-b58286052cb1/extract/0.log"
Apr 20 19:44:25.952424 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.952404 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz_4502f57e-2135-44c0-ae6b-b58286052cb1/util/0.log"
Apr 20 19:44:25.968690 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.968668 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05mptz_4502f57e-2135-44c0-ae6b-b58286052cb1/pull/0.log"
Apr 20 19:44:25.989194 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:25.989172 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr_dc114f12-adc1-421a-96a0-569fc1da86aa/extract/0.log"
Apr 20 19:44:26.007175 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.007147 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr_dc114f12-adc1-421a-96a0-569fc1da86aa/util/0.log"
Apr 20 19:44:26.022855 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.022833 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735fdrr_dc114f12-adc1-421a-96a0-569fc1da86aa/pull/0.log"
Apr 20 19:44:26.051179 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.051158 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7_116a59bd-7951-46e1-9b32-6b87612bb943/extract/0.log"
Apr 20 19:44:26.073077 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.073050 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7_116a59bd-7951-46e1-9b32-6b87612bb943/util/0.log"
Apr 20 19:44:26.091446 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.091431 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef18d7b7_116a59bd-7951-46e1-9b32-6b87612bb943/pull/0.log"
Apr 20 19:44:26.360133 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.360105 2583 log.go:25] "Finished parsing log file"
path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-rtcdw_9c2ffd5e-2a96-4e32-b4df-6b105a8a5339/manager/0.log"
Apr 20 19:44:26.395295 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.395268 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-9kfbb_ce6fd9d3-3d80-479c-8d81-378eb8d656e9/manager/0.log"
Apr 20 19:44:26.452561 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.452531 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nbm7p_b74fbe7d-f0b6-49ce-a4e7-32a21289dab5/registry-server/0.log"
Apr 20 19:44:26.614519 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:26.614488 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-gbrtw_2e8d84a0-947d-4f9c-82a5-340fe34abf01/manager/0.log"
Apr 20 19:44:28.237068 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.237032 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtgnn_05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86/node-exporter/0.log"
Apr 20 19:44:28.254768 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.254741 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtgnn_05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86/kube-rbac-proxy/0.log"
Apr 20 19:44:28.268735 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.268709 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xtgnn_05f31ef0-ffd3-4e69-925a-2f9cf6f8fc86/init-textfile/0.log"
Apr 20 19:44:28.357097 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.357071 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-nbtnh_521e9e92-3741-45e7-9ca3-a8f8e0338cec/kube-rbac-proxy-main/0.log"
Apr 20 19:44:28.373560 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.373533 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-nbtnh_521e9e92-3741-45e7-9ca3-a8f8e0338cec/kube-rbac-proxy-self/0.log"
Apr 20 19:44:28.391285 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.391262 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-nbtnh_521e9e92-3741-45e7-9ca3-a8f8e0338cec/openshift-state-metrics/0.log"
Apr 20 19:44:28.741327 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.741276 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b866f9874-rmh9f_fdd2c82f-fd4f-4928-812f-989d513cf8f6/thanos-query/0.log"
Apr 20 19:44:28.758266 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.758237 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b866f9874-rmh9f_fdd2c82f-fd4f-4928-812f-989d513cf8f6/kube-rbac-proxy-web/0.log"
Apr 20 19:44:28.774726 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.774700 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b866f9874-rmh9f_fdd2c82f-fd4f-4928-812f-989d513cf8f6/kube-rbac-proxy/0.log"
Apr 20 19:44:28.789703 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.789679 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b866f9874-rmh9f_fdd2c82f-fd4f-4928-812f-989d513cf8f6/prom-label-proxy/0.log"
Apr 20 19:44:28.805512 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.805493 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b866f9874-rmh9f_fdd2c82f-fd4f-4928-812f-989d513cf8f6/kube-rbac-proxy-rules/0.log"
Apr 20 19:44:28.822529 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:28.822511 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5b866f9874-rmh9f_fdd2c82f-fd4f-4928-812f-989d513cf8f6/kube-rbac-proxy-metrics/0.log"
Apr 20 19:44:30.687039 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:30.687005 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/1.log"
Apr 20 19:44:30.691930 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:30.691910 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mlqft_9befbc1e-c590-441a-af58-08f4e7b6d9a4/console-operator/2.log"
Apr 20 19:44:31.123206 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.123177 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7fd79b5855-7pg5g_f017cc90-7be5-407c-a49d-dc0386f71a29/console/0.log"
Apr 20 19:44:31.308142 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308112 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m"]
Apr 20 19:44:31.308493 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308481 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerName="cleanup"
Apr 20 19:44:31.308544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308495 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerName="cleanup"
Apr 20 19:44:31.308544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308506 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerName="cleanup"
Apr 20 19:44:31.308544 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308512 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerName="cleanup"
Apr 20 19:44:31.308644 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308580 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerName="cleanup"
Apr 20 19:44:31.308644 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.308588 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="63d994a8-a044-407e-b8c5-1ae8657419bc" containerName="cleanup"
Apr 20 19:44:31.311642 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.311627 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m"
Apr 20 19:44:31.314081 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.314053 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xgkc7\"/\"default-dockercfg-7j246\""
Apr 20 19:44:31.314242 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.314096 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xgkc7\"/\"kube-root-ca.crt\""
Apr 20 19:44:31.315219 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.315197 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xgkc7\"/\"openshift-service-ca.crt\""
Apr 20 19:44:31.321593 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.321570 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m"]
Apr 20 19:44:31.442412 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.442325 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-podres\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m"
Apr 20 19:44:31.442562 ip-10-0-139-126
kubenswrapper[2583]: I0420 19:44:31.442434 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-sys\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.442562 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.442467 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-lib-modules\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.442562 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.442505 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-proc\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.442562 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.442527 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv2j9\" (UniqueName: \"kubernetes.io/projected/04f84f6c-94db-4ba9-be59-b4b340c850b0-kube-api-access-rv2j9\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.542973 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.542943 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-podres\") 
pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543138 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543013 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-sys\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543138 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543035 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-lib-modules\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543138 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543085 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-proc\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543138 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543100 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-podres\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543138 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543131 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rv2j9\" (UniqueName: \"kubernetes.io/projected/04f84f6c-94db-4ba9-be59-b4b340c850b0-kube-api-access-rv2j9\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543355 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543148 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-proc\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543355 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543131 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-sys\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.543355 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.543186 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04f84f6c-94db-4ba9-be59-b4b340c850b0-lib-modules\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.550916 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.550891 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv2j9\" (UniqueName: \"kubernetes.io/projected/04f84f6c-94db-4ba9-be59-b4b340c850b0-kube-api-access-rv2j9\") pod \"perf-node-gather-daemonset-77s7m\" (UID: \"04f84f6c-94db-4ba9-be59-b4b340c850b0\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.622765 ip-10-0-139-126 
kubenswrapper[2583]: I0420 19:44:31.622730 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:31.754107 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.754082 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m"] Apr 20 19:44:31.755990 ip-10-0-139-126 kubenswrapper[2583]: W0420 19:44:31.755953 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod04f84f6c_94db_4ba9_be59_b4b340c850b0.slice/crio-72fee97ef63eea76846c215b6ea81ed44c66b48e15aff839b1d57e7ccc2f939d WatchSource:0}: Error finding container 72fee97ef63eea76846c215b6ea81ed44c66b48e15aff839b1d57e7ccc2f939d: Status 404 returned error can't find the container with id 72fee97ef63eea76846c215b6ea81ed44c66b48e15aff839b1d57e7ccc2f939d Apr 20 19:44:31.757595 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:31.757576 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:44:32.515542 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.515510 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tnhlk_34e2771f-67fc-4041-92d6-4479b21afc45/dns/0.log" Apr 20 19:44:32.531602 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.531574 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tnhlk_34e2771f-67fc-4041-92d6-4479b21afc45/kube-rbac-proxy/0.log" Apr 20 19:44:32.591818 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.591793 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qp25w_406ba3b2-33e1-4cc9-941c-55a06a114e38/dns-node-resolver/0.log" Apr 20 19:44:32.636762 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.636732 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" event={"ID":"04f84f6c-94db-4ba9-be59-b4b340c850b0","Type":"ContainerStarted","Data":"b55dee78e98d30e2e360231a14a08fa47f257fe8315f21e68e3920a7b3ccd367"} Apr 20 19:44:32.636762 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.636766 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" event={"ID":"04f84f6c-94db-4ba9-be59-b4b340c850b0","Type":"ContainerStarted","Data":"72fee97ef63eea76846c215b6ea81ed44c66b48e15aff839b1d57e7ccc2f939d"} Apr 20 19:44:32.636989 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.636882 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:32.653675 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:32.653627 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" podStartSLOduration=1.653613167 podStartE2EDuration="1.653613167s" podCreationTimestamp="2026-04-20 19:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:44:32.6512787 +0000 UTC m=+2178.883569220" watchObservedRunningTime="2026-04-20 19:44:32.653613167 +0000 UTC m=+2178.885903675" Apr 20 19:44:33.147079 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:33.147054 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nd8bd_8adeca3e-66a6-48a4-81e8-13898bdffa54/node-ca/0.log" Apr 20 19:44:33.877380 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:33.877355 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f8jvsv_16786c15-23e8-40e4-aacf-5e901d7ae46e/istio-proxy/0.log" Apr 20 19:44:34.125520 ip-10-0-139-126 kubenswrapper[2583]: I0420 
19:44:34.125491 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-f4xmt_772cb982-87f2-4078-9b45-47fd619a29ac/istio-proxy/0.log" Apr 20 19:44:34.145418 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:34.145340 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7fb965fb56-9fs7v_aa77ff70-05c0-428a-9836-2f6d05ddecd7/router/0.log" Apr 20 19:44:34.643811 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:34.643778 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-w5vbd_b4d15e0b-c594-487d-9587-1c87df888ece/serve-healthcheck-canary/0.log" Apr 20 19:44:35.076454 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:35.076429 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vrlc6_435b37d5-2251-4632-a312-07617c9af5af/kube-rbac-proxy/0.log" Apr 20 19:44:35.091128 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:35.091109 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vrlc6_435b37d5-2251-4632-a312-07617c9af5af/exporter/0.log" Apr 20 19:44:35.106767 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:35.106746 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vrlc6_435b37d5-2251-4632-a312-07617c9af5af/extractor/0.log" Apr 20 19:44:37.239549 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:37.239521 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c77764cd6-lcfgt_ed10f0f0-205a-48fe-9a6e-13bd52000a30/manager/0.log" Apr 20 19:44:38.330880 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:38.330849 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fcf468c68-lzc9n_919702f8-9e00-4cf6-b1cc-f45966b21a1d/manager/0.log" Apr 20 19:44:38.348783 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:38.348747 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-ztkbw_1ec94478-e96e-4ecf-b153-39178810d00b/openshift-lws-operator/0.log" Apr 20 19:44:38.651322 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:38.651233 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-77s7m" Apr 20 19:44:42.886892 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:42.886855 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b5mfm_7df06148-0d5a-4e8d-ba9d-6f7e64734e95/kube-storage-version-migrator-operator/1.log" Apr 20 19:44:42.888095 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:42.888076 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-b5mfm_7df06148-0d5a-4e8d-ba9d-6f7e64734e95/kube-storage-version-migrator-operator/0.log" Apr 20 19:44:43.856580 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.856551 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/kube-multus-additional-cni-plugins/0.log" Apr 20 19:44:43.873516 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.873477 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/egress-router-binary-copy/0.log" Apr 20 19:44:43.888604 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.888579 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/cni-plugins/0.log" Apr 20 19:44:43.904369 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.904351 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/bond-cni-plugin/0.log" Apr 20 19:44:43.921001 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.920981 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/routeoverride-cni/0.log" Apr 20 19:44:43.943726 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.943701 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/whereabouts-cni-bincopy/0.log" Apr 20 19:44:43.959246 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:43.959223 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m9kdg_0eb04f4c-6eed-4f33-ae86-5126283315de/whereabouts-cni/0.log" Apr 20 19:44:44.160484 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:44.160403 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pm94g_978e6f95-f2d1-47d8-a94e-f79ad5c672d6/kube-multus/0.log" Apr 20 19:44:44.196667 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:44.196640 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9gbcz_45a7c0b2-25d6-499d-9c36-ca4ace9c7813/network-metrics-daemon/0.log" Apr 20 19:44:44.212011 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:44.211989 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9gbcz_45a7c0b2-25d6-499d-9c36-ca4ace9c7813/kube-rbac-proxy/0.log" Apr 20 19:44:45.020834 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.020803 
2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/ovn-controller/0.log" Apr 20 19:44:45.044947 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.044920 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/ovn-acl-logging/0.log" Apr 20 19:44:45.059689 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.059667 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/kube-rbac-proxy-node/0.log" Apr 20 19:44:45.076170 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.076139 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 19:44:45.088353 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.088331 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/northd/0.log" Apr 20 19:44:45.102793 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.102767 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/nbdb/0.log" Apr 20 19:44:45.119578 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.119553 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/sbdb/0.log" Apr 20 19:44:45.219865 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:45.219832 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-46gfp_8cbf6223-eb9e-4c30-9be7-c289ced45992/ovnkube-controller/0.log" Apr 20 19:44:46.792816 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:46.792731 2583 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6tvvj_efd97a6a-3f36-4480-9d4f-2e8113c15af9/check-endpoints/0.log" Apr 20 19:44:46.850546 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:46.850514 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xb2cz_d79586bb-910e-443a-be9f-96dd10cd1d31/network-check-target-container/0.log" Apr 20 19:44:47.777929 ip-10-0-139-126 kubenswrapper[2583]: I0420 19:44:47.777900 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jbf6q_1c7034ae-80c0-4dd1-9098-c92022ad516a/iptables-alerter/0.log"