Apr 20 14:26:59.697060 ip-10-0-139-136 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:27:00.134777 ip-10-0-139-136 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:27:00.134777 ip-10-0-139-136 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:27:00.134777 ip-10-0-139-136 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:27:00.134777 ip-10-0-139-136 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:27:00.134777 ip-10-0-139-136 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:27:00.136103 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.136020 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:27:00.139129 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139114 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:27:00.139129 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139129 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139133 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139136 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139139 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139142 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139145 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139148 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139151 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139154 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139162 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139165 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139167 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139170 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139172 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139175 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139177 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139180 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139183 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139185 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139188 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:27:00.139193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139190 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139193 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139196 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139198 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139201 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139204 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139207 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139209 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139212 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139214 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139217 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139219 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139222 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139224 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139226 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139229 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139232 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139234 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139238 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:27:00.139701 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139242 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139245 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139247 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139249 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139252 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139254 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139258 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139260 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139262 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139265 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139267 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139270 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139272 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139274 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139277 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139281 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139284 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139287 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139290 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139292 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:27:00.140218 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139295 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139297 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139300 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139302 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139306 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139310 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139314 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139317 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139320 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139323 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139325 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139328 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139330 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139333 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139336 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139338 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139340 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139343 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139345 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139347 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:27:00.140697 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139350 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139352 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139355 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139358 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139360 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139362 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139777 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139784 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139788 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139791 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139794 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139796 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139799 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139802 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139805 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139808 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139813 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139816 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139818 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139821 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:27:00.141193 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139823 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139826 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139828 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139831 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139834 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139836 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139838 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139841 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139843 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139846 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139848 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139851 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139853 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139855 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139858 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139860 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139863 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139866 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139869 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139871 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:27:00.141688 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139874 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139876 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139879 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139881 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139883 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139886 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139889 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139891 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139894 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139896 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139899 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139901 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139904 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139906 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139908 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139911 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139914 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139917 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139920 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:27:00.142197 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139922 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139925 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139927 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139929 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139932 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139934 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139936 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139939 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139941 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139944 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139946 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139949 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139951 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139954 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139956 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139959 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139961 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139964 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139966 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139968 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:27:00.142707 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139971 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139974 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139976 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139979 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139981 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139984 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139986 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139990 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139996 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.139998 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.140001 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.140004 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.140006 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141367 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141376 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141382 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141387 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141391 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141394 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141399 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141404 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:27:00.143212 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141407 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141410 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141413 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141417 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141420 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141422 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141425 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141428 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141431 2581 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141434 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141436 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141441 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141444 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141447 2581 flags.go:64] FLAG: --config-dir=""
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141450 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141453 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141457 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141460 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141463 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141467 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141470 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141472 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141475 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141478 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141481 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:27:00.143744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141486 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141489 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141491 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141494 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:27:00.144349 ip-10-0-139-136
kubenswrapper[2581]: I0420 14:27:00.141497 2581 flags.go:64] FLAG: --enable-server="true" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141500 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141505 2581 flags.go:64] FLAG: --event-burst="100" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141508 2581 flags.go:64] FLAG: --event-qps="50" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141511 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141514 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141517 2581 flags.go:64] FLAG: --eviction-hard="" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141521 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141524 2581 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141527 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141530 2581 flags.go:64] FLAG: --eviction-soft="" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141533 2581 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141536 2581 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141538 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141541 2581 flags.go:64] FLAG: 
--experimental-mounter-path="" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141544 2581 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141547 2581 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141549 2581 flags.go:64] FLAG: --feature-gates="" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141553 2581 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141556 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141559 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 14:27:00.144349 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141563 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141566 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141569 2581 flags.go:64] FLAG: --help="false" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141572 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-139-136.ec2.internal" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141575 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141578 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141581 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141584 2581 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141587 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141590 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141593 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141595 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141598 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141601 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141604 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141607 2581 flags.go:64] FLAG: --kube-reserved="" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141610 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141612 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141615 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141618 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141621 2581 flags.go:64] FLAG: --lock-file="" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141624 2581 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141627 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141630 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:27:00.145011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141635 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141638 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141640 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141643 2581 flags.go:64] FLAG: --logging-format="text" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141646 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141649 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141651 2581 flags.go:64] FLAG: --manifest-url="" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141655 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141659 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141662 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141667 2581 flags.go:64] FLAG: --max-pods="110" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141669 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: 
I0420 14:27:00.141682 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141685 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141688 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141692 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141695 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141698 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141707 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141709 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141712 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141715 2581 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141718 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:27:00.145577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141737 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141740 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141743 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 20 
14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141746 2581 flags.go:64] FLAG: --port="10250" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141750 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141752 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0440871f06f945b03" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141756 2581 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141758 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141761 2581 flags.go:64] FLAG: --register-node="true" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141764 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141767 2581 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141770 2581 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141773 2581 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141776 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141778 2581 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141782 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141785 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141788 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 
14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141791 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141794 2581 flags.go:64] FLAG: --runonce="false" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141797 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141800 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141803 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141806 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141808 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141811 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:27:00.146168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141814 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141817 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141820 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141823 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141826 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141828 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:27:00.146840 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:27:00.141831 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141834 2581 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141837 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141842 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141845 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141848 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141852 2581 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141855 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141857 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141860 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141863 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141866 2581 flags.go:64] FLAG: --v="2" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141870 2581 flags.go:64] FLAG: --version="false" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141873 2581 flags.go:64] FLAG: --vmodule="" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141877 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 
14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.141881 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141977 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141981 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:27:00.146840 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141986 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141989 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141992 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141995 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.141997 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142000 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142003 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142005 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142008 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 
14:27:00.142011 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142013 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142016 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142019 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142021 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142024 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142026 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142029 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142032 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142035 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142037 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:27:00.147401 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142039 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142042 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:27:00.147912 ip-10-0-139-136 
kubenswrapper[2581]: W0420 14:27:00.142045 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142047 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142050 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142052 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142054 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142057 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142059 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142062 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142064 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142067 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142071 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142074 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142076 2581 feature_gate.go:328] unrecognized 
feature gate: EtcdBackendQuota Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142079 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142081 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142083 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142086 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142108 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:27:00.147912 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142112 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142115 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142118 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142121 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142124 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142126 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142129 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:27:00.148445 ip-10-0-139-136 
kubenswrapper[2581]: W0420 14:27:00.142131 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142136 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142139 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142142 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142145 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142147 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142150 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142152 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142154 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142157 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142159 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142162 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:27:00.148445 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142164 2581 feature_gate.go:328] unrecognized 
feature gate: NewOLMOwnSingleNamespace Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142166 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142169 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142171 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142174 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142178 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142180 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142184 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142186 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142188 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142191 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142193 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142195 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:27:00.148930 ip-10-0-139-136 
kubenswrapper[2581]: W0420 14:27:00.142198 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142202 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142205 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142209 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142212 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142215 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142218 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:27:00.148930 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142220 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142223 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142225 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142227 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.142230 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.142970 2581 feature_gate.go:384] feature 
gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.149382 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 14:27:00.149421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.149396 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 14:27:00.150054 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150015 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:27:00.150054 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150053 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:27:00.150054 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150058 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150063 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150068 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150072 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150110 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150117 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150129 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150134 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150139 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150144 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150148 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150152 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150157 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150161 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150165 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150170 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150174 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150178 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150182 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150192 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:27:00.150254 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150196 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150200 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150204 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150209 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150213 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150219 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150224 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150229 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150232 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150236 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150241 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150280 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150285 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150289 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150293 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150298 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150302 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150306 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150310 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150314 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:27:00.150902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150318 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150321 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150325 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150334 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150337 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150341 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150345 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150349 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150353 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150357 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150361 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150365 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150369 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150373 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150378 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150382 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150391 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150395 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150399 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:27:00.151539 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150403 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150407 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150411 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150415 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150419 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150481 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150486 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150491 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150530 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150552 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150559 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150565 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150569 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150574 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150580 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.150585 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151028 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151144 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151147 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151150 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151153 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151156 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151160 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151163 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151166 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.151172 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151290 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151295 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151298 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151300 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151303 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151306 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151308 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151311 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151313 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:27:00.152515 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151316 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151319 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151322 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151324 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151327 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151329 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151332 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151334 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151336 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151339 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151341 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151344 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151347 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151350 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151352 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151355 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151357 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151360 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151364 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:27:00.152902 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151368 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151370 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151373 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151376 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151379 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151381 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151384 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151387 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151389 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151391 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151394 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151396 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151399 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151401 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151404 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151407 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151410 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151413 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151415 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151418 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:27:00.153359 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151420 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151423 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151425 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151428 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151430 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151433 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151435 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151438 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151440 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151442 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151445 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151447 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151449 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151452 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151454 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151457 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151459 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151462 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151464 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151467 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:27:00.154001 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151469 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151472 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151474 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151476 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151479 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151481 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151483 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151486 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
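The bulk of this capture is the same `unrecognized feature gate` warning, repeated once per gate on each of several config-parsing passes. A quick way to reduce a capture like this to the distinct gate names is a sed/sort pipeline (a sketch; the sample file below is a three-line excerpt standing in for your own log capture):

```shell
# Build a small sample excerpt; in practice, point sed at your real capture.
cat > /tmp/kubelet-sample.log <<'EOF'
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151147 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:27:00.152016 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151150 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151506 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
EOF
# sed strips everything up to the gate name; sort -u collapses the repeated passes.
sed -n 's/.*unrecognized feature gate: //p' /tmp/kubelet-sample.log | sort -u
```

On the sample above this prints each gate name once, which is usually all that matters when comparing the list against the feature gates the cluster operator actually manages.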
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151490 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151493 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151496 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151498 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151501 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151503 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151506 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151508 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151510 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:27:00.154565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:00.151513 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:27:00.155028 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.151518 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:27:00.155028 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.152212 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 14:27:00.155028 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.154010 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 14:27:00.155028 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.154847 2581 server.go:1019] "Starting client certificate rotation"
Apr 20 14:27:00.155028 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.154939 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:27:00.155782 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.155770 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:27:00.178487 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.178470 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:27:00.184904 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.184886 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:27:00.197894 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.197873 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:27:00.203816 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.203802 2581 log.go:25] "Validated CRI v1 image API"
Apr 20 14:27:00.205020 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.205005 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:27:00.209767 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.209740 2581 fs.go:135] Filesystem UUIDs: map[799b861e-2007-4a64-a28a-16d25a99607d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 d91bf3de-f30b-4771-a0a1-edc6905ee8cb:/dev/nvme0n1p3]
Apr 20 14:27:00.209908 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.209873 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:27:00.213689 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.213631 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:27:00.218250 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.218140 2581 manager.go:217] Machine: {Timestamp:2026-04-20 14:27:00.216351028 +0000 UTC m=+0.395839490 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096204 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21c3f14ea74122d97c2b10af60174e SystemUUID:ec21c3f1-4ea7-4122-d97c-2b10af60174e BootID:fa3e8a28-3d08-4e91-bc39-2ef8bb722181 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:98:a9:1d:eb:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:98:a9:1d:eb:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:19:bc:d5:d9:c6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:27:00.218250 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.218239 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 14:27:00.218381 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.218308 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:27:00.220405 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.220383 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:27:00.220538 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.220408 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-136.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:27:00.220579 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.220547 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:27:00.220579 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.220556 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:27:00.220579 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.220569 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:27:00.221585 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.221575 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:27:00.222667 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.222657 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:27:00.222818 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.222809 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 14:27:00.225043 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.225034 2581 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 14:27:00.225077 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.225052 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 14:27:00.225077 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.225063 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 14:27:00.225077 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.225072 2581 kubelet.go:397] "Adding apiserver pod source"
Apr 20 14:27:00.225170 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.225081 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 14:27:00.226045 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.226033 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:27:00.226087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.226051 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:27:00.228744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.228708 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 14:27:00.230611 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.230598 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 14:27:00.231982 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.231965 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 14:27:00.231982 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.231984 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.231990 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232005 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232010 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232016 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232021 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232027 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232033 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232039 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232054 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 14:27:00.232082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.232062 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 14:27:00.233424 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.233407 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nbrd5"
Apr 20 14:27:00.233910 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.233898 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 14:27:00.233910 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.233911 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 14:27:00.235245 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.235211 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 14:27:00.235331 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.235227 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-136.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 14:27:00.237406 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.237390 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 14:27:00.237490 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.237432 2581 server.go:1295] "Started kubelet"
Apr 20 14:27:00.237543 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.237498 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 14:27:00.237630 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.237590 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 14:27:00.237683 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.237645 2581 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 14:27:00.238218 ip-10-0-139-136 systemd[1]: Started Kubernetes Kubelet.
Apr 20 14:27:00.240490 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.240470 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nbrd5"
Apr 20 14:27:00.240669 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.240655 2581 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 14:27:00.241557 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.241539 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 14:27:00.245280 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.245261 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 14:27:00.245967 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.245954 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 14:27:00.246029 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.245967 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 14:27:00.246526 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.246510 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 14:27:00.246613 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.246603 2581 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 14:27:00.246674 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.246620 2581 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 14:27:00.246779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.246761 2581 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 14:27:00.246779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.246776 2581 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 14:27:00.246940 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.246816 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.247069 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247038 2581 factory.go:153] Registering CRI-O factory
Apr 20 14:27:00.247069 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247063 2581 factory.go:223] Registration of the crio container factory successfully
Apr 20 14:27:00.247188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247116 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 14:27:00.247188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247125 2581 factory.go:55] Registering systemd factory
Apr 20 14:27:00.247188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247132 2581 factory.go:223] Registration of the systemd container factory successfully
Apr 20 14:27:00.247188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247151 2581 factory.go:103] Registering Raw factory
Apr 20 14:27:00.247188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.247162 2581 manager.go:1196] Started watching for new ooms in manager
Apr 20 14:27:00.248106 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.248088 2581 manager.go:319] Starting recovery of all containers
Apr 20 14:27:00.249131 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.249109 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:27:00.253335 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.253272 2581 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-136.ec2.internal\" not found" node="ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.253492 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.253284 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-136.ec2.internal" not found
Apr 20 14:27:00.262563 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.262546 2581 manager.go:324] Recovery completed
Apr 20 14:27:00.266504 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.266491 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:27:00.269463 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.269449 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-136.ec2.internal" not found
Apr 20 14:27:00.269916 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.269903 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:27:00.269972 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.269927 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:27:00.269972 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.269937 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:27:00.270399 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.270386 2581 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 14:27:00.270399 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.270398 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 14:27:00.270475 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.270414 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:27:00.272479 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.272467 2581 policy_none.go:49] "None policy: Start"
Apr 20 14:27:00.272517 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.272484 2581 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 14:27:00.272517 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.272495 2581 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 14:27:00.319248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319234 2581 manager.go:341] "Starting Device Plugin manager"
Apr 20 14:27:00.319359 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.319262 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 14:27:00.319359 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319271 2581 server.go:85] "Starting device plugin registration server"
Apr 20 14:27:00.319483 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319470 2581 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 14:27:00.319532 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319488 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 14:27:00.319576 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319565 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 14:27:00.319658 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319643 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 14:27:00.319658 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.319656 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 14:27:00.320181 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.320162 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 14:27:00.320256 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.320205 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.334313 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.334293 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-136.ec2.internal" not found
Apr 20 14:27:00.380984 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.380958 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 14:27:00.382207 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.382184 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 14:27:00.382282 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.382220 2581 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 14:27:00.382282 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.382241 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 14:27:00.382282 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.382249 2581 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 14:27:00.382418 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.382330 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 14:27:00.385577 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.385533 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:27:00.419737 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.419696 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:27:00.420471 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.420457 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:27:00.420533 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.420487 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:27:00.420533 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.420500 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:27:00.420533 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.420521 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.429018 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.429005 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.429086 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.429025 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-136.ec2.internal\": node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.454448 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.454430 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.482676 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.482656 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"]
Apr 20 14:27:00.482747 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.482735 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:27:00.483489 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.483475 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:27:00.483558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.483501 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:27:00.483558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.483512 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:27:00.484550 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.484538 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:27:00.484698 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.484684 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.484779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.484712 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:27:00.485113 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.485098 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:27:00.485195 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.485126 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:27:00.485195 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.485136 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:27:00.485195 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.485097 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:27:00.485333 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.485214 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:27:00.485333 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.485232 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:27:00.486521 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.486507 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.486575 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.486533 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:27:00.487144 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.487131 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:27:00.487217 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.487153 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:27:00.487217 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.487162 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:27:00.511347 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.511330 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-136.ec2.internal\" not found" node="ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.515513 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.515497 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-136.ec2.internal\" not found" node="ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.549031 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.549015 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.549114 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.549039 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.549114 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.549068 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b16f66466a195429f73bc1a0dcec09-config\") pod \"kube-apiserver-proxy-ip-10-0-139-136.ec2.internal\" (UID: \"57b16f66466a195429f73bc1a0dcec09\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.555114 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.555102 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.649713 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.649640 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.649713 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.649684 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.649713 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.649702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b16f66466a195429f73bc1a0dcec09-config\") pod \"kube-apiserver-proxy-ip-10-0-139-136.ec2.internal\" (UID: \"57b16f66466a195429f73bc1a0dcec09\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.649904 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.649744 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/57b16f66466a195429f73bc1a0dcec09-config\") pod \"kube-apiserver-proxy-ip-10-0-139-136.ec2.internal\" (UID: \"57b16f66466a195429f73bc1a0dcec09\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.649904 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.649754 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.649904 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.649761 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c05921670fca4842dd48b5deb56ad8b1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal\" (UID: \"c05921670fca4842dd48b5deb56ad8b1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.655761 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.655743 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.756505 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.756479 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.812591 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.812576 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.817003 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:00.816988 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:00.856798 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.856774 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:00.957277 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:00.957222 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:01.057774 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:01.057744 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:01.155253 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.155227 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 14:27:01.155925 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.155383 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 14:27:01.155925 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.155396 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 14:27:01.158372 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:01.158354 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:01.242939 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.242908 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:22:00 +0000 UTC" deadline="2027-12-28 20:38:35.294176994 +0000 UTC"
Apr 20 14:27:01.242939 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.242938 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14814h11m34.051242185s"
Apr 20 14:27:01.247187 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.247063 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 14:27:01.258536 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:01.258515 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-136.ec2.internal\" not found"
Apr 20 14:27:01.262602 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.262580 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:27:01.283428 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.283401 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m9jxp"
Apr 20 14:27:01.285498 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.285477 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:27:01.291997 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.291968 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m9jxp"
Apr 20 14:27:01.340277 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:01.340247 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b16f66466a195429f73bc1a0dcec09.slice/crio-b60fbac6e5ace8f1eeae171d499226baf0588dd4b46839d002cd893c7d049462 WatchSource:0}: Error finding container b60fbac6e5ace8f1eeae171d499226baf0588dd4b46839d002cd893c7d049462: Status 404 returned error can't find the container with id b60fbac6e5ace8f1eeae171d499226baf0588dd4b46839d002cd893c7d049462
Apr 20 14:27:01.345304 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.345289 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 14:27:01.346902 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.346887 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:01.358019 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.358003 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 14:27:01.359558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.359543 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"
Apr 20 14:27:01.364103 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:01.364084 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc05921670fca4842dd48b5deb56ad8b1.slice/crio-f8565f9c313800a9a7e3e77647aacce3945b6adcdcc0a2e202663643870a6b11 WatchSource:0}: Error finding container f8565f9c313800a9a7e3e77647aacce3945b6adcdcc0a2e202663643870a6b11: Status 404 returned error can't find the container with id f8565f9c313800a9a7e3e77647aacce3945b6adcdcc0a2e202663643870a6b11
Apr 20 14:27:01.367618 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.367603 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 14:27:01.385597 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.385553 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerStarted","Data":"f8565f9c313800a9a7e3e77647aacce3945b6adcdcc0a2e202663643870a6b11"}
Apr 20 14:27:01.388391 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.388361 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" event={"ID":"57b16f66466a195429f73bc1a0dcec09","Type":"ContainerStarted","Data":"b60fbac6e5ace8f1eeae171d499226baf0588dd4b46839d002cd893c7d049462"}
Apr 20 14:27:01.804739 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:01.804692 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:27:02.226106 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.226072 2581 apiserver.go:52] "Watching apiserver"
Apr 20 14:27:02.233875 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.233851 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 14:27:02.235154 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.235123 2581 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c","openshift-network-diagnostics/network-check-target-s2vgq","openshift-network-operator/iptables-alerter-gfsfz","openshift-cluster-node-tuning-operator/tuned-wbpln","openshift-dns/node-resolver-lmkks","openshift-image-registry/node-ca-q5vqc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal","openshift-multus/multus-additional-cni-plugins-ddt9k","openshift-multus/multus-jhhn2","openshift-multus/network-metrics-daemon-t787k","openshift-ovn-kubernetes/ovnkube-node-r8xsm","kube-system/konnectivity-agent-fz7q2","kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal"] Apr 20 14:27:02.238044 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.238022 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.238147 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.238113 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:02.239204 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.239181 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:02.239325 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.239260 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:02.240186 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.240168 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gfsfz" Apr 20 14:27:02.241193 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.241173 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.242317 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.242301 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:27:02.242420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.242312 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.242420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.242365 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lmkks" Apr 20 14:27:02.242530 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.242304 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8skvl\"" Apr 20 14:27:02.242742 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.242708 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.243182 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.243165 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.243274 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.243214 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.243274 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.243222 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-25znw\"" Apr 20 14:27:02.243525 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.243505 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q5vqc" Apr 20 14:27:02.243654 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.243634 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.244355 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.244337 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.244528 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.244514 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.244604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.244522 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qhcng\"" Apr 20 14:27:02.245049 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.244810 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.245373 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.245348 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.245504 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.245489 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.246192 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.245861 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:27:02.246192 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.245944 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.246192 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.245961 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.246447 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246398 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:27:02.246502 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246452 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lslm2\"" Apr 20 14:27:02.246502 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246479 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mmqxz\"" Apr 20 14:27:02.246665 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246651 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:27:02.246764 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246749 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.246824 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246783 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:27:02.246864 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.246841 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:27:02.251053 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.249384 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mp542\"" Apr 20 14:27:02.251053 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.249491 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.251053 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.249503 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:27:02.251053 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.250130 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.251053 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.250187 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rfgjt\"" Apr 20 14:27:02.251359 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.251306 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fz7q2" Apr 20 14:27:02.251431 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.251418 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.253401 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.253378 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:27:02.254295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.254164 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:27:02.254295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.254221 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:27:02.254376 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.254329 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:27:02.254376 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.254368 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:27:02.255168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.255150 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:27:02.255257 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.255190 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:27:02.255257 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.255205 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mgnbv\"" Apr 20 14:27:02.255464 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.255409 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bwrw8\"" Apr 
20 14:27:02.255464 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.255420 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:27:02.257793 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257775 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-tuned\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.257883 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-cni-bin\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.257883 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257833 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-hostroot\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.257883 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257856 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-etc-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.258023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257880 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t769\" (UniqueName: \"kubernetes.io/projected/4828e4f0-6105-42f8-8ec4-54d66f7d101d-kube-api-access-7t769\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks" Apr 20 14:27:02.258023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-socket-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.258023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257930 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-systemd\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.258023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.257955 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-os-release\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.258023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258003 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-systemd-units\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 
14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258036 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-cni-bin\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258063 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-ovnkube-script-lib\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258084 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcgbm\" (UniqueName: \"kubernetes.io/projected/a5549a94-73ae-4c4d-a853-281d46a86d49-kube-api-access-vcgbm\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258108 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e353688-0174-417f-9f72-f6ecdd5c06a2-host-slash\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258142 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-cni-multus\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258170 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-conf-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258189 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4529\" (UniqueName: \"kubernetes.io/projected/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-kube-api-access-r4529\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.258225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258214 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-host\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258237 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgfc\" (UniqueName: \"kubernetes.io/projected/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-kube-api-access-bfgfc\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258252 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnvq\" (UniqueName: \"kubernetes.io/projected/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-kube-api-access-pqnvq\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258266 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-cnibin\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258330 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258364 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-cni-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258388 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-os-release\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " 
pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258433 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258469 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-registration-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258495 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-etc-selinux\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258518 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-cni-netd\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.258558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258552 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-env-overrides\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258586 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-k8s-cni-cncf-io\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258611 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-kubelet\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258637 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-etc-kubernetes\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258661 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2kdp\" (UniqueName: \"kubernetes.io/projected/330b3bed-9410-44fc-9f4d-401c49180ff9-kube-api-access-k2kdp\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " 
pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258683 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-cnibin\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-slash\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258752 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-tmp\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258774 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-multus-certs\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258797 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9dcm\" (UniqueName: \"kubernetes.io/projected/97b3ed92-b422-4a1d-bd36-49e50a7f088d-kube-api-access-p9dcm\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258821 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-var-lib-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258843 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258882 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-node-log\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258919 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-log-socket\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258944 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.258970 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-modprobe-d\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.259086 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259012 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-run-netns\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259036 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259052 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-ovnkube-config\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259068 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-cni-binary-copy\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259091 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-run\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259111 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-systemd\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259125 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-ovn\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259142 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4828e4f0-6105-42f8-8ec4-54d66f7d101d-hosts-file\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259164 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-kubernetes\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259185 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-var-lib-kubelet\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259217 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-socket-dir-parent\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259257 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259322 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259353 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-device-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259369 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-sys-fs\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259387 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b54dd67-12ec-486a-ac4c-dcdccdd01de9-konnectivity-ca\") pod \"konnectivity-agent-fz7q2\" (UID: \"6b54dd67-12ec-486a-ac4c-dcdccdd01de9\") " pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:02.259740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259421 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysctl-conf\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259454 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97b3ed92-b422-4a1d-bd36-49e50a7f088d-cni-binary-copy\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259480 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-system-cni-dir\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259504 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysctl-d\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259531 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e353688-0174-417f-9f72-f6ecdd5c06a2-iptables-alerter-script\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259553 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4828e4f0-6105-42f8-8ec4-54d66f7d101d-tmp-dir\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259575 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-host\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259604 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5549a94-73ae-4c4d-a853-281d46a86d49-ovn-node-metrics-cert\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259628 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-netns\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259654 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysconfig\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259693 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-lib-modules\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259714 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-daemon-config\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259774 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zccn\" (UniqueName: \"kubernetes.io/projected/9e353688-0174-417f-9f72-f6ecdd5c06a2-kube-api-access-5zccn\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259810 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259833 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-sys\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.260312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259857 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-system-cni-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.260909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259881 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-kubelet\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.260909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259904 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-serviceca\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc"
Apr 20 14:27:02.260909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259931 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4lr\" (UniqueName: \"kubernetes.io/projected/322c3a65-5b88-428e-9a37-4f9727dbbbbd-kube-api-access-5g4lr\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.260909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.259978 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b54dd67-12ec-486a-ac4c-dcdccdd01de9-agent-certs\") pod \"konnectivity-agent-fz7q2\" (UID: \"6b54dd67-12ec-486a-ac4c-dcdccdd01de9\") " pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:02.292657 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.292536 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:22:01 +0000 UTC" deadline="2028-01-13 15:14:24.955258766 +0000 UTC"
Apr 20 14:27:02.292657 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.292564 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15192h47m22.662698062s"
Apr 20 14:27:02.347660 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.347636 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 14:27:02.361094 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysctl-d\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361207 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361102 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e353688-0174-417f-9f72-f6ecdd5c06a2-iptables-alerter-script\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz"
Apr 20 14:27:02.361207 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361141 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4828e4f0-6105-42f8-8ec4-54d66f7d101d-tmp-dir\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks"
Apr 20 14:27:02.361207 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361163 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-host\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc"
Apr 20 14:27:02.361368 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361215 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5549a94-73ae-4c4d-a853-281d46a86d49-ovn-node-metrics-cert\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.361368 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361240 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-netns\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.361368 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361295 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:02.361368 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361296 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysctl-d\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361368 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361320 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-host\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc"
Apr 20 14:27:02.361368 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361347 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysconfig\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361389 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-netns\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361400 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysconfig\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361423 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-lib-modules\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361452 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-daemon-config\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361477 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zccn\" (UniqueName: \"kubernetes.io/projected/9e353688-0174-417f-9f72-f6ecdd5c06a2-kube-api-access-5zccn\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361492 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4828e4f0-6105-42f8-8ec4-54d66f7d101d-tmp-dir\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361502 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361544 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-sys\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361588 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-lib-modules\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.361639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361600 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-sys\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361696 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-system-cni-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361746 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-kubelet\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361750 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e353688-0174-417f-9f72-f6ecdd5c06a2-iptables-alerter-script\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361766 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-serviceca\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361754 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361801 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4lr\" (UniqueName: \"kubernetes.io/projected/322c3a65-5b88-428e-9a37-4f9727dbbbbd-kube-api-access-5g4lr\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361838 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b54dd67-12ec-486a-ac4c-dcdccdd01de9-agent-certs\") pod \"konnectivity-agent-fz7q2\" (UID: \"6b54dd67-12ec-486a-ac4c-dcdccdd01de9\") " pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361860 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-tuned\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361863 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-system-cni-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361881 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-cni-bin\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361903 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-hostroot\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361946 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-cni-bin\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361950 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-hostroot\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.361999 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-etc-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362039 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t769\" (UniqueName: \"kubernetes.io/projected/4828e4f0-6105-42f8-8ec4-54d66f7d101d-kube-api-access-7t769\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362042 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-kubelet\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362056 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-serviceca\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362083 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-etc-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362085 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-socket-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362254 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-socket-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362259 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-daemon-config\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362273 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-systemd\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362316 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-os-release\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-systemd-units\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362359 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-os-release\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362369 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-cni-bin\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362395 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-systemd-units\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362394 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-ovnkube-script-lib\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362426 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcgbm\" (UniqueName: \"kubernetes.io/projected/a5549a94-73ae-4c4d-a853-281d46a86d49-kube-api-access-vcgbm\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362436 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName:
\"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-cni-bin\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.362857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.362318 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-systemd\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.363816 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363791 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-ovnkube-script-lib\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.363854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363843 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e353688-0174-417f-9f72-f6ecdd5c06a2-host-slash\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz" Apr 20 14:27:02.363893 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363883 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-cni-multus\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.363928 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363914 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-conf-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.363957 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363938 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4529\" (UniqueName: \"kubernetes.io/projected/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-kube-api-access-r4529\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.363994 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363966 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-host\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.364029 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.363996 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgfc\" (UniqueName: \"kubernetes.io/projected/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-kube-api-access-bfgfc\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.364069 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364026 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnvq\" (UniqueName: \"kubernetes.io/projected/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-kube-api-access-pqnvq\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc" Apr 20 14:27:02.364069 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364056 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-cnibin\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.364139 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364081 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.364139 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364115 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-cni-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.364195 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364145 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-os-release\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.364195 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364176 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.364261 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364207 
2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-registration-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.364261 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364237 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-etc-selinux\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.364328 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364263 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-cni-netd\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364328 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364292 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-env-overrides\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364328 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364321 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-k8s-cni-cncf-io\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 
14:27:02.364420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364351 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-kubelet\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.364420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-etc-kubernetes\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.364420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364406 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2kdp\" (UniqueName: \"kubernetes.io/projected/330b3bed-9410-44fc-9f4d-401c49180ff9-kube-api-access-k2kdp\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.364516 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364435 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-cnibin\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.364516 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364465 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-slash\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364516 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364493 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-tmp\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.364612 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-multus-certs\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.364612 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364551 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9dcm\" (UniqueName: \"kubernetes.io/projected/97b3ed92-b422-4a1d-bd36-49e50a7f088d-kube-api-access-p9dcm\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.364612 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364579 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-var-lib-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364709 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364611 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364709 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364640 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-node-log\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364709 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364678 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-log-socket\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364710 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-modprobe-d\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.364827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-run-netns\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 
14:27:02.364827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364944 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364844 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-ovnkube-config\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.364944 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364879 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-cni-binary-copy\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.364944 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364907 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-run\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.365056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364968 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-systemd\") pod \"ovnkube-node-r8xsm\" (UID: 
\"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.365056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.364991 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-ovn\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.365056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365022 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4828e4f0-6105-42f8-8ec4-54d66f7d101d-hosts-file\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks" Apr 20 14:27:02.365056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365051 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-kubernetes\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365078 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-var-lib-kubelet\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365106 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-socket-dir-parent\") pod \"multus-jhhn2\" (UID: 
\"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365162 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-device-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365220 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-sys-fs\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.365234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365221 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-tuned\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.365548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365246 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b54dd67-12ec-486a-ac4c-dcdccdd01de9-konnectivity-ca\") pod \"konnectivity-agent-fz7q2\" (UID: \"6b54dd67-12ec-486a-ac4c-dcdccdd01de9\") " pod="kube-system/konnectivity-agent-fz7q2" Apr 20 14:27:02.365548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365270 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysctl-conf\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.365548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365297 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97b3ed92-b422-4a1d-bd36-49e50a7f088d-cni-binary-copy\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.365548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365326 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-system-cni-dir\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.365548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365433 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-system-cni-dir\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.365548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365492 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-multus-certs\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.365854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365564 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e353688-0174-417f-9f72-f6ecdd5c06a2-host-slash\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz" Apr 20 14:27:02.365854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365619 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-var-lib-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.365854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365686 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-conf-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.365854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365752 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-openvswitch\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.365854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365821 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-run-ovn-kubernetes\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.365854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365842 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-host\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.366115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365873 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6b54dd67-12ec-486a-ac4c-dcdccdd01de9-agent-certs\") pod \"konnectivity-agent-fz7q2\" (UID: \"6b54dd67-12ec-486a-ac4c-dcdccdd01de9\") " pod="kube-system/konnectivity-agent-fz7q2" Apr 20 14:27:02.366115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365895 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-run-netns\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365941 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366031 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-run\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.366296 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366156 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-cni-dir\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.366296 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-systemd\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366296 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366258 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-os-release\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.366434 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366311 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-run-ovn\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366434 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366322 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-cni-netd\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366434 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366398 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4828e4f0-6105-42f8-8ec4-54d66f7d101d-hosts-file\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks" Apr 20 14:27:02.366434 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366427 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.366613 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366454 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-kubernetes\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.366613 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366588 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-cni-multus\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.366705 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366628 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-node-log\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366705 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.365941 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-modprobe-d\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.366895 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366875 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-env-overrides\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.366969 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.366947 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-cni-binary-copy\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.367073 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.367057 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:02.367128 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367102 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-etc-sysctl-conf\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.367178 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.367149 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:02.867117951 +0000 UTC m=+3.046606416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:02.367337 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367318 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/330b3bed-9410-44fc-9f4d-401c49180ff9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.367582 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367560 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a5549a94-73ae-4c4d-a853-281d46a86d49-ovnkube-config\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 
14:27:02.367660 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367597 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97b3ed92-b422-4a1d-bd36-49e50a7f088d-cni-binary-copy\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.367660 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367629 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-registration-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.367783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367658 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-var-lib-kubelet\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.367783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367676 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-etc-selinux\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.367783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367694 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-host-slash\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 
20 14:27:02.367783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367714 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.367783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-run-k8s-cni-cncf-io\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.367783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367780 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-device-dir\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.368056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367799 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-host-var-lib-kubelet\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.368056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367829 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/322c3a65-5b88-428e-9a37-4f9727dbbbbd-sys-fs\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.368056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367840 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-etc-kubernetes\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.368056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.367956 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a5549a94-73ae-4c4d-a853-281d46a86d49-log-socket\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.368234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.368126 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-multus-socket-dir-parent\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.368286 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.368244 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6b54dd67-12ec-486a-ac4c-dcdccdd01de9-konnectivity-ca\") pod \"konnectivity-agent-fz7q2\" (UID: \"6b54dd67-12ec-486a-ac4c-dcdccdd01de9\") " pod="kube-system/konnectivity-agent-fz7q2" Apr 20 14:27:02.368469 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.368448 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97b3ed92-b422-4a1d-bd36-49e50a7f088d-cnibin\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 
14:27:02.368580 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.368496 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/330b3bed-9410-44fc-9f4d-401c49180ff9-cnibin\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.368708 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.368678 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:02.368708 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.368704 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:02.368873 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.368717 2581 projected.go:194] Error preparing data for projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:02.368873 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.368793 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:02.868777328 +0000 UTC m=+3.048265788 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:02.369779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.369759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-tmp\") pod \"tuned-wbpln\" (UID: \"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.371414 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.371369 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zccn\" (UniqueName: \"kubernetes.io/projected/9e353688-0174-417f-9f72-f6ecdd5c06a2-kube-api-access-5zccn\") pod \"iptables-alerter-gfsfz\" (UID: \"9e353688-0174-417f-9f72-f6ecdd5c06a2\") " pod="openshift-network-operator/iptables-alerter-gfsfz" Apr 20 14:27:02.371636 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.371599 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a5549a94-73ae-4c4d-a853-281d46a86d49-ovn-node-metrics-cert\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.372473 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.372448 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4lr\" (UniqueName: \"kubernetes.io/projected/322c3a65-5b88-428e-9a37-4f9727dbbbbd-kube-api-access-5g4lr\") pod \"aws-ebs-csi-driver-node-5fv2c\" (UID: \"322c3a65-5b88-428e-9a37-4f9727dbbbbd\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.373854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.373835 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcgbm\" (UniqueName: \"kubernetes.io/projected/a5549a94-73ae-4c4d-a853-281d46a86d49-kube-api-access-vcgbm\") pod \"ovnkube-node-r8xsm\" (UID: \"a5549a94-73ae-4c4d-a853-281d46a86d49\") " pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.374197 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.374176 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t769\" (UniqueName: \"kubernetes.io/projected/4828e4f0-6105-42f8-8ec4-54d66f7d101d-kube-api-access-7t769\") pod \"node-resolver-lmkks\" (UID: \"4828e4f0-6105-42f8-8ec4-54d66f7d101d\") " pod="openshift-dns/node-resolver-lmkks" Apr 20 14:27:02.376177 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.376147 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9dcm\" (UniqueName: \"kubernetes.io/projected/97b3ed92-b422-4a1d-bd36-49e50a7f088d-kube-api-access-p9dcm\") pod \"multus-jhhn2\" (UID: \"97b3ed92-b422-4a1d-bd36-49e50a7f088d\") " pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.376422 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.376405 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4529\" (UniqueName: \"kubernetes.io/projected/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-kube-api-access-r4529\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.379238 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.379220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgfc\" (UniqueName: \"kubernetes.io/projected/06ea58d8-2345-4053-8f38-f78ea9cdf2c9-kube-api-access-bfgfc\") pod \"tuned-wbpln\" (UID: 
\"06ea58d8-2345-4053-8f38-f78ea9cdf2c9\") " pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.379701 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.379679 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnvq\" (UniqueName: \"kubernetes.io/projected/9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c-kube-api-access-pqnvq\") pod \"node-ca-q5vqc\" (UID: \"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c\") " pod="openshift-image-registry/node-ca-q5vqc" Apr 20 14:27:02.380020 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.380003 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2kdp\" (UniqueName: \"kubernetes.io/projected/330b3bed-9410-44fc-9f4d-401c49180ff9-kube-api-access-k2kdp\") pod \"multus-additional-cni-plugins-ddt9k\" (UID: \"330b3bed-9410-44fc-9f4d-401c49180ff9\") " pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.558614 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.558521 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gfsfz" Apr 20 14:27:02.565342 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.565316 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wbpln" Apr 20 14:27:02.572037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.572016 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lmkks" Apr 20 14:27:02.578404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.578375 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q5vqc" Apr 20 14:27:02.583955 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.583923 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" Apr 20 14:27:02.589522 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.589506 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jhhn2" Apr 20 14:27:02.595632 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.595593 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" Apr 20 14:27:02.602350 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.602331 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fz7q2" Apr 20 14:27:02.603712 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.603695 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:27:02.608797 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.608780 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" Apr 20 14:27:02.751604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.751574 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:27:02.869262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.869181 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:02.869262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:02.869235 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:02.869455 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.869358 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:02.869455 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.869363 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:02.869455 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.869383 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:02.869455 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.869392 2581 projected.go:194] Error preparing data for 
projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:02.869455 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.869432 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:03.869411775 +0000 UTC m=+4.048900228 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:02.869648 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:02.869452 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:03.86944194 +0000 UTC m=+4.048930394 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:02.986890 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.986853 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b54dd67_12ec_486a_ac4c_dcdccdd01de9.slice/crio-e8a5c846fe8a25b25ac9db5ddfe271970238cf288e883afb48fd7be1f1f69e44 WatchSource:0}: Error finding container e8a5c846fe8a25b25ac9db5ddfe271970238cf288e883afb48fd7be1f1f69e44: Status 404 returned error can't find the container with id e8a5c846fe8a25b25ac9db5ddfe271970238cf288e883afb48fd7be1f1f69e44 Apr 20 14:27:02.987603 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.987581 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod322c3a65_5b88_428e_9a37_4f9727dbbbbd.slice/crio-a1c49f40f60fab540a6ca57d61bdf41446617e8e6baee06a81bf0657d02665d6 WatchSource:0}: Error finding container a1c49f40f60fab540a6ca57d61bdf41446617e8e6baee06a81bf0657d02665d6: Status 404 returned error can't find the container with id a1c49f40f60fab540a6ca57d61bdf41446617e8e6baee06a81bf0657d02665d6 Apr 20 14:27:02.989474 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.989423 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5549a94_73ae_4c4d_a853_281d46a86d49.slice/crio-ff958f1b105de5bcdf4d91507123ae40a11c04993e9570836043053a12a7c19e WatchSource:0}: Error finding container ff958f1b105de5bcdf4d91507123ae40a11c04993e9570836043053a12a7c19e: Status 404 returned error can't find the 
container with id ff958f1b105de5bcdf4d91507123ae40a11c04993e9570836043053a12a7c19e Apr 20 14:27:02.991934 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.991908 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e353688_0174_417f_9f72_f6ecdd5c06a2.slice/crio-a678c5f5668d8e44931fd72e3654f7cef9ae63d10f77b15c1a5283424a912ed2 WatchSource:0}: Error finding container a678c5f5668d8e44931fd72e3654f7cef9ae63d10f77b15c1a5283424a912ed2: Status 404 returned error can't find the container with id a678c5f5668d8e44931fd72e3654f7cef9ae63d10f77b15c1a5283424a912ed2 Apr 20 14:27:02.994247 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.993138 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330b3bed_9410_44fc_9f4d_401c49180ff9.slice/crio-90ad71b17cffbaae26de335da0ed51107f1824823ab08bbc0482a92a4a4f4fd2 WatchSource:0}: Error finding container 90ad71b17cffbaae26de335da0ed51107f1824823ab08bbc0482a92a4a4f4fd2: Status 404 returned error can't find the container with id 90ad71b17cffbaae26de335da0ed51107f1824823ab08bbc0482a92a4a4f4fd2 Apr 20 14:27:02.994247 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.994082 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06ea58d8_2345_4053_8f38_f78ea9cdf2c9.slice/crio-ed2ddaff5ff5d484289a5a7b83b10f44e7846cab6221bd8cc56104fb76da8792 WatchSource:0}: Error finding container ed2ddaff5ff5d484289a5a7b83b10f44e7846cab6221bd8cc56104fb76da8792: Status 404 returned error can't find the container with id ed2ddaff5ff5d484289a5a7b83b10f44e7846cab6221bd8cc56104fb76da8792 Apr 20 14:27:02.994819 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.994793 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b3ed92_b422_4a1d_bd36_49e50a7f088d.slice/crio-6a437d2933c519d1325b9e7bfd096101591fb104f8e3b0817a45f4886bfe3830 WatchSource:0}: Error finding container 6a437d2933c519d1325b9e7bfd096101591fb104f8e3b0817a45f4886bfe3830: Status 404 returned error can't find the container with id 6a437d2933c519d1325b9e7bfd096101591fb104f8e3b0817a45f4886bfe3830 Apr 20 14:27:02.995948 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.995857 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4828e4f0_6105_42f8_8ec4_54d66f7d101d.slice/crio-c25c6b9e933fb864e4ad805da848a90acb5705d0725a44d82d6ebd25ba403585 WatchSource:0}: Error finding container c25c6b9e933fb864e4ad805da848a90acb5705d0725a44d82d6ebd25ba403585: Status 404 returned error can't find the container with id c25c6b9e933fb864e4ad805da848a90acb5705d0725a44d82d6ebd25ba403585 Apr 20 14:27:02.997276 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:02.997250 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d52b0cc_f4c2_4ce7_a3ab_7dce264fc27c.slice/crio-005c8bfc0ea1a1f19a1df79c20212e53389ae1437f8662414e6605004fb48ac4 WatchSource:0}: Error finding container 005c8bfc0ea1a1f19a1df79c20212e53389ae1437f8662414e6605004fb48ac4: Status 404 returned error can't find the container with id 005c8bfc0ea1a1f19a1df79c20212e53389ae1437f8662414e6605004fb48ac4 Apr 20 14:27:03.293648 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.293609 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:22:01 +0000 UTC" deadline="2027-11-14 20:43:30.792912306 +0000 UTC" Apr 20 14:27:03.293648 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.293644 2581 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13758h16m27.499271513s" Apr 20 14:27:03.383366 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.383327 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:03.383541 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.383459 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:03.398192 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.398158 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" event={"ID":"57b16f66466a195429f73bc1a0dcec09","Type":"ContainerStarted","Data":"aa7c1ca644e3b57e4837bbb3b98971c3098cd1b807dd4b003c5ff3375c34d825"} Apr 20 14:27:03.401604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.401565 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q5vqc" event={"ID":"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c","Type":"ContainerStarted","Data":"005c8bfc0ea1a1f19a1df79c20212e53389ae1437f8662414e6605004fb48ac4"} Apr 20 14:27:03.402857 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.402824 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lmkks" event={"ID":"4828e4f0-6105-42f8-8ec4-54d66f7d101d","Type":"ContainerStarted","Data":"c25c6b9e933fb864e4ad805da848a90acb5705d0725a44d82d6ebd25ba403585"} Apr 20 14:27:03.404467 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.404441 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gfsfz" 
event={"ID":"9e353688-0174-417f-9f72-f6ecdd5c06a2","Type":"ContainerStarted","Data":"a678c5f5668d8e44931fd72e3654f7cef9ae63d10f77b15c1a5283424a912ed2"} Apr 20 14:27:03.406256 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.406231 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"ff958f1b105de5bcdf4d91507123ae40a11c04993e9570836043053a12a7c19e"} Apr 20 14:27:03.407321 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.407300 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fz7q2" event={"ID":"6b54dd67-12ec-486a-ac4c-dcdccdd01de9","Type":"ContainerStarted","Data":"e8a5c846fe8a25b25ac9db5ddfe271970238cf288e883afb48fd7be1f1f69e44"} Apr 20 14:27:03.416411 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.416389 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jhhn2" event={"ID":"97b3ed92-b422-4a1d-bd36-49e50a7f088d","Type":"ContainerStarted","Data":"6a437d2933c519d1325b9e7bfd096101591fb104f8e3b0817a45f4886bfe3830"} Apr 20 14:27:03.419661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.419625 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wbpln" event={"ID":"06ea58d8-2345-4053-8f38-f78ea9cdf2c9","Type":"ContainerStarted","Data":"ed2ddaff5ff5d484289a5a7b83b10f44e7846cab6221bd8cc56104fb76da8792"} Apr 20 14:27:03.422110 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.422056 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerStarted","Data":"90ad71b17cffbaae26de335da0ed51107f1824823ab08bbc0482a92a4a4f4fd2"} Apr 20 14:27:03.431343 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.431305 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" event={"ID":"322c3a65-5b88-428e-9a37-4f9727dbbbbd","Type":"ContainerStarted","Data":"a1c49f40f60fab540a6ca57d61bdf41446617e8e6baee06a81bf0657d02665d6"} Apr 20 14:27:03.877151 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.877115 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:03.877297 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:03.877190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:03.877396 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.877378 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:03.877462 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.877405 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:03.877462 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.877418 2581 projected.go:194] Error preparing data for projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:03.877563 
ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.877469 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:05.877451091 +0000 UTC m=+6.056939551 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:03.877911 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.877865 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:03.878011 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:03.877915 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:05.877900405 +0000 UTC m=+6.057388855 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:04.385636 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:04.385598 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:04.386104 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:04.385778 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:04.446750 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:04.446304 2581 generic.go:358] "Generic (PLEG): container finished" podID="c05921670fca4842dd48b5deb56ad8b1" containerID="a398d02e86f19d0a065ba706b84ad5db9e421a809a2de8a98f8ff632a4a7e45f" exitCode=0 Apr 20 14:27:04.446750 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:04.446445 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerDied","Data":"a398d02e86f19d0a065ba706b84ad5db9e421a809a2de8a98f8ff632a4a7e45f"} Apr 20 14:27:04.462541 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:04.462230 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-136.ec2.internal" podStartSLOduration=3.462214689 podStartE2EDuration="3.462214689s" podCreationTimestamp="2026-04-20 14:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:27:03.413844834 +0000 UTC m=+3.593333303" watchObservedRunningTime="2026-04-20 14:27:04.462214689 +0000 UTC m=+4.641703158" Apr 20 14:27:05.382963 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:05.382681 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:05.383132 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.383071 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:05.453309 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:05.453277 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" event={"ID":"c05921670fca4842dd48b5deb56ad8b1","Type":"ContainerStarted","Data":"ee2a2e40c16dbc8f7c852a4c35a4bdf41888d7b1b80be50966f1011211248b34"} Apr 20 14:27:05.897057 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:05.897024 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:05.897238 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:05.897094 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:05.897238 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.897234 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:05.897344 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.897250 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:05.897344 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.897263 2581 projected.go:194] Error preparing data for projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:05.897344 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.897305 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:09.897292195 +0000 UTC m=+10.076780642 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:05.897597 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.897583 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:05.897662 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:05.897618 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:09.897609665 +0000 UTC m=+10.077098111 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:06.384017 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:06.383492 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:06.384017 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:06.383626 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:07.382846 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:07.382816 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:07.383266 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:07.382940 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:08.382920 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:08.382887 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:08.383492 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:08.383016 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:09.383373 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:09.383338 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:09.383874 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.383461 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:09.933426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:09.933346 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:09.933426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:09.933420 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:09.933654 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.933546 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:09.933654 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.933590 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:27:17.933575781 +0000 UTC m=+18.113064226 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:09.933772 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.933675 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:09.933772 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.933689 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:09.933772 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.933700 2581 projected.go:194] Error preparing data for projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:09.933772 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:09.933741 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:17.933715653 +0000 UTC m=+18.113204100 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:10.383912 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:10.383877 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:10.384393 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:10.384002 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:11.383171 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:11.383138 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:11.383364 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:11.383265 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:12.382969 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:12.382939 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:12.383423 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:12.383046 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:13.382635 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:13.382605 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:13.382800 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:13.382704 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:14.382695 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:14.382647 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:14.383139 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:14.382805 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:15.383109 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:15.383033 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:15.383644 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:15.383153 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:16.383328 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:16.383292 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:16.383773 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:16.383423 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:17.383323 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:17.383286 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:17.383491 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:17.383404 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:17.989139 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:17.989099 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:17.989339 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:17.989166 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:17.989339 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:17.989263 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:17.989339 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:17.989283 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:17.989339 ip-10-0-139-136 kubenswrapper[2581]: 
E0420 14:27:17.989293 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:17.989339 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:17.989296 2581 projected.go:194] Error preparing data for projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:17.989543 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:17.989350 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:33.989332967 +0000 UTC m=+34.168821414 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:17.989543 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:17.989369 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:33.989360347 +0000 UTC m=+34.168848798 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:18.383604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:18.383524 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:18.384201 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:18.383651 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:19.383218 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:19.383183 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:19.383394 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:19.383305 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:20.383887 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:20.383863 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k" Apr 20 14:27:20.384199 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:20.383945 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66" Apr 20 14:27:21.383611 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.383236 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq" Apr 20 14:27:21.383776 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:21.383652 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3" Apr 20 14:27:21.477959 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.477921 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jhhn2" event={"ID":"97b3ed92-b422-4a1d-bd36-49e50a7f088d","Type":"ContainerStarted","Data":"a0fb017c9897ce19290f9d4840185d0d274ffe8e55be676eb263dc6f497c55ca"} Apr 20 14:27:21.480279 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.479885 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wbpln" event={"ID":"06ea58d8-2345-4053-8f38-f78ea9cdf2c9","Type":"ContainerStarted","Data":"d502f0bdeed603374d27448ce09a6880b015725b377622cf4c80c5608490c166"} Apr 20 14:27:21.481523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.481495 2581 generic.go:358] "Generic (PLEG): container finished" podID="330b3bed-9410-44fc-9f4d-401c49180ff9" containerID="823625b55108ee34d649678b684c8750c5ef118d2f7a2224efe3809a33332917" exitCode=0 Apr 20 14:27:21.481652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.481566 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerDied","Data":"823625b55108ee34d649678b684c8750c5ef118d2f7a2224efe3809a33332917"} Apr 20 14:27:21.483956 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.483933 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" event={"ID":"322c3a65-5b88-428e-9a37-4f9727dbbbbd","Type":"ContainerStarted","Data":"5b336d69b93571a29c81b0e9949c9da60880d70128dba8538fb80e918313bdb1"} Apr 20 14:27:21.485174 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.485154 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q5vqc" 
event={"ID":"9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c","Type":"ContainerStarted","Data":"e7bf411c89f6b8b6eeec7b9920038c34a81e0a175dda0ba789d34ccc630b035e"} Apr 20 14:27:21.486315 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.486292 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lmkks" event={"ID":"4828e4f0-6105-42f8-8ec4-54d66f7d101d","Type":"ContainerStarted","Data":"bf4cf3bfa7e483bb7475f59f4f5726ff456376ad1ba885e6aed4cc2aa988e20d"} Apr 20 14:27:21.489520 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.489502 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.489929 2581 generic.go:358] "Generic (PLEG): container finished" podID="a5549a94-73ae-4c4d-a853-281d46a86d49" containerID="18b3e6e392f1e1072bcbe87a4f616b5cf3e8adb2048e34e560a8eea97f454296" exitCode=1 Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.489987 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"83b3e95d36bb0690ac676ae00db5488ab36354569d277134db761bccb3190615"} Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.490019 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"4a1ce9a33785634bae91939533ae153f86e3623d22aba34604d7a05a4e98db57"} Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.490033 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" 
event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"0757811d699ede45f4d12b4fad6793f7dcc197f24c32c5bae7471c583da8a5e6"}
Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.490045 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"95bf413352307c29101a6cceee074400b0f1d25d90cdc27b2bcd3de7f241e7ef"}
Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.490058 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerDied","Data":"18b3e6e392f1e1072bcbe87a4f616b5cf3e8adb2048e34e560a8eea97f454296"}
Apr 20 14:27:21.491037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.490070 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"eb07035fe08abf22b67b6f221af128cc59c2d1565653128424eba9c3144122e4"}
Apr 20 14:27:21.492952 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.492397 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fz7q2" event={"ID":"6b54dd67-12ec-486a-ac4c-dcdccdd01de9","Type":"ContainerStarted","Data":"4b1668239d0ce29ede5fa439c4e2d5aa8cb1403bf28d89a9b7b9f3acfe77747a"}
Apr 20 14:27:21.509612 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.509550 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-136.ec2.internal" podStartSLOduration=20.50953434 podStartE2EDuration="20.50953434s" podCreationTimestamp="2026-04-20 14:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:27:05.469694636 +0000 UTC m=+5.649183103" watchObservedRunningTime="2026-04-20 14:27:21.50953434 +0000 UTC m=+21.689022809"
Apr 20 14:27:21.509794 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.509769 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jhhn2" podStartSLOduration=3.978925694 podStartE2EDuration="21.509758581s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.997213963 +0000 UTC m=+3.176702413" lastFinishedPulling="2026-04-20 14:27:20.528046852 +0000 UTC m=+20.707535300" observedRunningTime="2026-04-20 14:27:21.509316584 +0000 UTC m=+21.688805053" watchObservedRunningTime="2026-04-20 14:27:21.509758581 +0000 UTC m=+21.689247053"
Apr 20 14:27:21.528606 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.528568 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fz7q2" podStartSLOduration=8.930472888 podStartE2EDuration="21.528558294s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.989605167 +0000 UTC m=+3.169093612" lastFinishedPulling="2026-04-20 14:27:15.587690572 +0000 UTC m=+15.767179018" observedRunningTime="2026-04-20 14:27:21.528448006 +0000 UTC m=+21.707936473" watchObservedRunningTime="2026-04-20 14:27:21.528558294 +0000 UTC m=+21.708046763"
Apr 20 14:27:21.544664 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.544618 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wbpln" podStartSLOduration=4.030820012 podStartE2EDuration="21.544601649s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.996420964 +0000 UTC m=+3.175909412" lastFinishedPulling="2026-04-20 14:27:20.510202596 +0000 UTC m=+20.689691049" observedRunningTime="2026-04-20 14:27:21.544555273 +0000 UTC m=+21.724043743" watchObservedRunningTime="2026-04-20 14:27:21.544601649 +0000 UTC m=+21.724090117"
Apr 20 14:27:21.576229 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.576178 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lmkks" podStartSLOduration=4.061566376 podStartE2EDuration="21.576161597s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.997829045 +0000 UTC m=+3.177317491" lastFinishedPulling="2026-04-20 14:27:20.51242425 +0000 UTC m=+20.691912712" observedRunningTime="2026-04-20 14:27:21.575979599 +0000 UTC m=+21.755468069" watchObservedRunningTime="2026-04-20 14:27:21.576161597 +0000 UTC m=+21.755650067"
Apr 20 14:27:21.576343 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:21.576263 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q5vqc" podStartSLOduration=4.108687171 podStartE2EDuration="21.576254994s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.998570774 +0000 UTC m=+3.178059220" lastFinishedPulling="2026-04-20 14:27:20.466138593 +0000 UTC m=+20.645627043" observedRunningTime="2026-04-20 14:27:21.558586375 +0000 UTC m=+21.738074846" watchObservedRunningTime="2026-04-20 14:27:21.576254994 +0000 UTC m=+21.755743464"
Apr 20 14:27:22.241555 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.241473 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 14:27:22.333273 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.333187 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:27:22.24148912Z","UUID":"75bfb584-1fb7-4311-a506-aecb4282f652","Handler":null,"Name":"","Endpoint":""}
Apr 20 14:27:22.336318 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.336104
2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 14:27:22.336318 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.336132 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 14:27:22.382621 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.382599 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:22.382765 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:22.382743 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:22.495981 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.495908 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gfsfz" event={"ID":"9e353688-0174-417f-9f72-f6ecdd5c06a2","Type":"ContainerStarted","Data":"d5ec8491df760b14d3a230bc0deb79cb62a7f40918eacde8467bde1c17af7eff"}
Apr 20 14:27:22.497854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.497810 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" event={"ID":"322c3a65-5b88-428e-9a37-4f9727dbbbbd","Type":"ContainerStarted","Data":"68437d0636a61fc3d122f323e7a63f417137c86e4ccc583bfb128606c647dc89"}
Apr 20 14:27:22.513830 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:22.513777 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gfsfz" podStartSLOduration=4.9978555270000005 podStartE2EDuration="22.513765095s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.994158682 +0000 UTC m=+3.173647128" lastFinishedPulling="2026-04-20 14:27:20.510068243 +0000 UTC m=+20.689556696" observedRunningTime="2026-04-20 14:27:22.513534471 +0000 UTC m=+22.693022941" watchObservedRunningTime="2026-04-20 14:27:22.513765095 +0000 UTC m=+22.693253562"
Apr 20 14:27:23.382995 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:23.382962 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:23.383198 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:23.383084 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:24.382820 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:24.382786 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:24.383409 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:24.382920 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:24.518436 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:24.518407 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 14:27:24.518864 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:24.518833 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"107a78da9f9f4402d4d26e104159729bf81a52681358bfaad20c2576751f42d8"}
Apr 20 14:27:24.520851 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:24.520822 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" event={"ID":"322c3a65-5b88-428e-9a37-4f9727dbbbbd","Type":"ContainerStarted","Data":"01a8845462f9b7e0d138c3c880a993ad48daaee747e2b65d6bece30939d3bb04"}
Apr 20 14:27:24.538254 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:24.538198 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5fv2c" podStartSLOduration=4.094852562 podStartE2EDuration="24.538181727s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.99082984 +0000 UTC m=+3.170318287" lastFinishedPulling="2026-04-20 14:27:23.434159006 +0000 UTC m=+23.613647452" observedRunningTime="2026-04-20 14:27:24.53635564 +0000 UTC m=+24.715844107" watchObservedRunningTime="2026-04-20 14:27:24.538181727 +0000 UTC m=+24.717670197"
Apr 20 14:27:25.382840 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:25.382812 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:25.383231 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:25.382915 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:25.741791 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:25.741380 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:25.742501 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:25.742481 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:26.382534 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.382503 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:26.382679 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:26.382606 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:26.526189 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.526157 2581 generic.go:358] "Generic (PLEG): container finished" podID="330b3bed-9410-44fc-9f4d-401c49180ff9" containerID="98a0f37988b999f8a41ba4f856c1b2e9e9d2b7c6efe53f2e210c3cb58774baad" exitCode=0
Apr 20 14:27:26.526682 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.526235 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerDied","Data":"98a0f37988b999f8a41ba4f856c1b2e9e9d2b7c6efe53f2e210c3cb58774baad"}
Apr 20 14:27:26.529535 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.529520 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 14:27:26.529963 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.529931 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"888374ab42cd384c3a2a0771b77977f548aefe52d1b120b7bbe31bfa8f6e70b2"}
Apr 20 14:27:26.530172 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.530156 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:26.530272 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.530178 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:26.530272 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.530192 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:26.530272 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.530204 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:26.530399 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.530373 2581 scope.go:117] "RemoveContainer" containerID="18b3e6e392f1e1072bcbe87a4f616b5cf3e8adb2048e34e560a8eea97f454296"
Apr 20 14:27:26.530686 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.530666 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fz7q2"
Apr 20 14:27:26.546108 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.546089 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:26.546618 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:26.546598 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:27:27.382860 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.382689 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:27.382995 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:27.382942 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:27.534102 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.534081 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 14:27:27.534647 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.534458 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" event={"ID":"a5549a94-73ae-4c4d-a853-281d46a86d49","Type":"ContainerStarted","Data":"84e051548fd142f5a3f50b54d192c8d2f4589b6fe85c5eb49ad53b8b9730a8f0"}
Apr 20 14:27:27.536225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.536200 2581 generic.go:358] "Generic (PLEG): container finished" podID="330b3bed-9410-44fc-9f4d-401c49180ff9" containerID="bea0f8d674bde002b1841668c16d171d8574ad0390fe1da51dc783cbd36bbd99" exitCode=0
Apr 20 14:27:27.536312 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.536282 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerDied","Data":"bea0f8d674bde002b1841668c16d171d8574ad0390fe1da51dc783cbd36bbd99"}
Apr 20 14:27:27.560545 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.560505 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm" podStartSLOduration=9.996933671 podStartE2EDuration="27.560493346s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.991842326 +0000 UTC m=+3.171330772" lastFinishedPulling="2026-04-20 14:27:20.555402001 +0000 UTC m=+20.734890447" observedRunningTime="2026-04-20 14:27:27.559122354 +0000 UTC m=+27.738610822" watchObservedRunningTime="2026-04-20 14:27:27.560493346 +0000 UTC m=+27.739981814"
Apr 20 14:27:27.603841 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.603818 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s2vgq"]
Apr 20 14:27:27.603944 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.603916 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:27.604007 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:27.603990 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:27.606719 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.606697 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t787k"]
Apr 20 14:27:27.606827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:27.606790 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:27.606877 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:27.606865 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:28.540278 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:28.540244 2581 generic.go:358] "Generic (PLEG): container finished" podID="330b3bed-9410-44fc-9f4d-401c49180ff9" containerID="2ce76bea7326886b8cbf2281c672e975fa5fa45c4d1f25280fdaac2d760be16e" exitCode=0
Apr 20 14:27:28.540687 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:28.540335 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerDied","Data":"2ce76bea7326886b8cbf2281c672e975fa5fa45c4d1f25280fdaac2d760be16e"}
Apr 20 14:27:29.383093 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:29.383062 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:29.383247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:29.383133 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:29.383315 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:29.383256 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:29.383397 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:29.383371 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:31.382838 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:31.382803 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:31.383406 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:31.382817 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:31.383406 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:31.382921 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:31.383406 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:31.383021 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:33.383637 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.383422 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:33.384048 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.383425 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:33.384048 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:33.383741 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s2vgq" podUID="9ee0de80-7894-40f1-bba0-63975a8765c3"
Apr 20 14:27:33.384048 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:33.383840 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t787k" podUID="5b3c9c26-01c0-40b2-ba38-e4b72ba81f66"
Apr 20 14:27:33.681063 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.680949 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-136.ec2.internal" event="NodeReady"
Apr 20 14:27:33.681208 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.681092 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 14:27:33.723085 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.723052 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5qgnp"]
Apr 20 14:27:33.727608 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.727578 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:33.729908 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.729881 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5k2mg\""
Apr 20 14:27:33.730087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.729934 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 14:27:33.730087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.729883 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 14:27:33.732543 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.732522 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-htgwf"]
Apr 20 14:27:33.736143 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.736122 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:33.737304 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.737284 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5qgnp"]
Apr 20 14:27:33.738385 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.738366 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 14:27:33.738479 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.738403 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-79jcr\""
Apr 20 14:27:33.738667 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.738627 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 14:27:33.738781 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.738629 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 14:27:33.741576 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.741556 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-htgwf"]
Apr 20 14:27:33.909823 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.909783 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce3a0704-031d-43e8-87ac-e53039d6f376-config-volume\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:33.909823 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.909825 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:33.910058 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.909848 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ce3a0704-031d-43e8-87ac-e53039d6f376-tmp-dir\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:33.910058 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.909883 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:33.910058 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.909899 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97chl\" (UniqueName: \"kubernetes.io/projected/ce3a0704-031d-43e8-87ac-e53039d6f376-kube-api-access-97chl\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:33.910058 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:33.909924 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qhv\" (UniqueName: \"kubernetes.io/projected/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-kube-api-access-57qhv\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:34.011070 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011038 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011100 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011124 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce3a0704-031d-43e8-87ac-e53039d6f376-config-volume\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") "
pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011148 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011169 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ce3a0704-031d-43e8-87ac-e53039d6f376-tmp-dir\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011198 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011218 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97chl\" (UniqueName: \"kubernetes.io/projected/ce3a0704-031d-43e8-87ac-e53039d6f376-kube-api-access-97chl\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.011260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.011250 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57qhv\" (UniqueName: \"kubernetes.io/projected/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-kube-api-access-57qhv\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:34.011641 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.011619 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:27:34.011641 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.011643 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:27:34.011845 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.011655 2581 projected.go:194] Error preparing data for projected volume kube-api-access-zz5j5 for pod openshift-network-diagnostics/network-check-target-s2vgq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:27:34.011845 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.011697 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5 podName:9ee0de80-7894-40f1-bba0-63975a8765c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:06.011682897 +0000 UTC m=+66.191171343 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zz5j5" (UniqueName: "kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5") pod "network-check-target-s2vgq" (UID: "9ee0de80-7894-40f1-bba0-63975a8765c3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:27:34.012096 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.012077 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:27:34.012164 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.012129 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:06.012114675 +0000 UTC m=+66.191603123 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:27:34.013427 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.012708 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce3a0704-031d-43e8-87ac-e53039d6f376-config-volume\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.013427 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.012821 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:34.013427 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.012863 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:34.512848371 +0000 UTC m=+34.692336817 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:34.013427 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.013106 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ce3a0704-031d-43e8-87ac-e53039d6f376-tmp-dir\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.013427 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.013183 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:34.013427 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.013215 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:34.513205632 +0000 UTC m=+34.692694078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:27:34.022711 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.022688 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qhv\" (UniqueName: \"kubernetes.io/projected/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-kube-api-access-57qhv\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:34.023096 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.023078 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97chl\" (UniqueName: \"kubernetes.io/projected/ce3a0704-031d-43e8-87ac-e53039d6f376-kube-api-access-97chl\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.514528 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.514498 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:34.514917 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.514538 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:34.514917 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.514626 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret
"canary-serving-cert" not found
Apr 20 14:27:34.514917 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.514659 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:34.514917 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.514676 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:35.514662989 +0000 UTC m=+35.694151436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:27:34.514917 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:34.514744 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:35.514704468 +0000 UTC m=+35.694192937 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:34.554552 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.554499 2581 generic.go:358] "Generic (PLEG): container finished" podID="330b3bed-9410-44fc-9f4d-401c49180ff9" containerID="a409c201acd9b491e4081d538377ba3cc64039487cbbb1ad83ccbbabf7c5fb0b" exitCode=0
Apr 20 14:27:34.554644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:34.554570 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerDied","Data":"a409c201acd9b491e4081d538377ba3cc64039487cbbb1ad83ccbbabf7c5fb0b"}
Apr 20 14:27:35.382752 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.382670 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:27:35.382871 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.382670 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:27:35.385550 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.385531 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 14:27:35.385661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.385547 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 14:27:35.385661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.385608 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 14:27:35.386137 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.386114 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xmk8q\""
Apr 20 14:27:35.386388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.386374 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xgs6p\""
Apr 20 14:27:35.521207 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.521189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:35.521476 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.521218 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:35.521476 ip-10-0-139-136 kubenswrapper[2581]:
E0420 14:27:35.521314 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:35.521476 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:35.521315 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:35.521476 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:35.521363 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:37.521345378 +0000 UTC m=+37.700833825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:27:35.521476 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:35.521376 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:37.521370201 +0000 UTC m=+37.700858647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:35.558915 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.558892 2581 generic.go:358] "Generic (PLEG): container finished" podID="330b3bed-9410-44fc-9f4d-401c49180ff9" containerID="e6b103321104f4d336389dca894ace52ff3a0d580bafc615455576b77b430248" exitCode=0
Apr 20 14:27:35.558998 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:35.558937 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerDied","Data":"e6b103321104f4d336389dca894ace52ff3a0d580bafc615455576b77b430248"}
Apr 20 14:27:36.563046 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:36.563008 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" event={"ID":"330b3bed-9410-44fc-9f4d-401c49180ff9","Type":"ContainerStarted","Data":"ef4d653d6c8a29def53849a3607901da3aa8f904695ad0a33aad83d52cf53874"}
Apr 20 14:27:36.592780 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:36.592705 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ddt9k" podStartSLOduration=5.468755309 podStartE2EDuration="36.592692301s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:27:02.995550455 +0000 UTC m=+3.175038901" lastFinishedPulling="2026-04-20 14:27:34.119487432 +0000 UTC m=+34.298975893" observedRunningTime="2026-04-20 14:27:36.592296497 +0000 UTC m=+36.771784965" watchObservedRunningTime="2026-04-20 14:27:36.592692301 +0000 UTC m=+36.772180769"
Apr 20 14:27:37.535966 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:37.535935 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:37.536107 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:37.535976 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:37.536107 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:37.536077 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:37.536107 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:37.536089 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:37.536204 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:37.536140 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:41.536123931 +0000 UTC m=+41.715612377 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:37.536204 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:37.536155 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed.
No retries permitted until 2026-04-20 14:27:41.536148878 +0000 UTC m=+41.715637324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:27:41.563136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:41.563093 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:41.563136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:41.563139 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:41.563616 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:41.563234 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:41.563616 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:41.563236 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:41.563616 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:41.563285 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:49.563270133 +0000 UTC m=+49.742758578 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:27:41.563616 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:41.563297 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:49.563291376 +0000 UTC m=+49.742779822 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:42.076893 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.076865 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"]
Apr 20 14:27:42.082413 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.082399 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.087535 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.087314 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"]
Apr 20 14:27:42.087676 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.087608 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 14:27:42.087793 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.087772 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 14:27:42.087879 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.087639 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 14:27:42.087879 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.087772 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-pr7hp\""
Apr 20 14:27:42.087991 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.087883 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 14:27:42.268882 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.268835 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/21479ee1-c363-4e59-bf3f-602edfc2259a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-797d49f99c-m52nw\" (UID: \"21479ee1-c363-4e59-bf3f-602edfc2259a\") "
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.269057 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.268928 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f2gl\" (UniqueName: \"kubernetes.io/projected/21479ee1-c363-4e59-bf3f-602edfc2259a-kube-api-access-5f2gl\") pod \"managed-serviceaccount-addon-agent-797d49f99c-m52nw\" (UID: \"21479ee1-c363-4e59-bf3f-602edfc2259a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.369787 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.369678 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f2gl\" (UniqueName: \"kubernetes.io/projected/21479ee1-c363-4e59-bf3f-602edfc2259a-kube-api-access-5f2gl\") pod \"managed-serviceaccount-addon-agent-797d49f99c-m52nw\" (UID: \"21479ee1-c363-4e59-bf3f-602edfc2259a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.369914 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.369801 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/21479ee1-c363-4e59-bf3f-602edfc2259a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-797d49f99c-m52nw\" (UID: \"21479ee1-c363-4e59-bf3f-602edfc2259a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.373495 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.373473 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/21479ee1-c363-4e59-bf3f-602edfc2259a-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-797d49f99c-m52nw\" (UID: \"21479ee1-c363-4e59-bf3f-602edfc2259a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.380073 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.380051 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f2gl\" (UniqueName: \"kubernetes.io/projected/21479ee1-c363-4e59-bf3f-602edfc2259a-kube-api-access-5f2gl\") pod \"managed-serviceaccount-addon-agent-797d49f99c-m52nw\" (UID: \"21479ee1-c363-4e59-bf3f-602edfc2259a\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.406620 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.406589 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"
Apr 20 14:27:42.528853 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.528819 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw"]
Apr 20 14:27:42.533229 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:27:42.533201 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21479ee1_c363_4e59_bf3f_602edfc2259a.slice/crio-8262ceca0e4b49bdad3b6bf17f16d448ddfa3f2f30855aa98b6b005cc6d9416b WatchSource:0}: Error finding container 8262ceca0e4b49bdad3b6bf17f16d448ddfa3f2f30855aa98b6b005cc6d9416b: Status 404 returned error can't find the container with id 8262ceca0e4b49bdad3b6bf17f16d448ddfa3f2f30855aa98b6b005cc6d9416b
Apr 20 14:27:42.573178 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:42.573150 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw" event={"ID":"21479ee1-c363-4e59-bf3f-602edfc2259a","Type":"ContainerStarted","Data":"8262ceca0e4b49bdad3b6bf17f16d448ddfa3f2f30855aa98b6b005cc6d9416b"}
Apr 20 14:27:46.582870 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:46.582833 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw" event={"ID":"21479ee1-c363-4e59-bf3f-602edfc2259a","Type":"ContainerStarted","Data":"b89b7fe2c2562ec01894e2b0d3bd305f06d383d92a31c94c8260191dbde33125"}
Apr 20 14:27:46.598267 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:46.598214 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-797d49f99c-m52nw" podStartSLOduration=1.361121801 podStartE2EDuration="4.598196382s" podCreationTimestamp="2026-04-20 14:27:42 +0000 UTC" firstStartedPulling="2026-04-20 14:27:42.535085207 +0000 UTC m=+42.714573653" lastFinishedPulling="2026-04-20 14:27:45.772159788 +0000 UTC m=+45.951648234" observedRunningTime="2026-04-20 14:27:46.597602019 +0000 UTC m=+46.777090465" watchObservedRunningTime="2026-04-20 14:27:46.598196382 +0000 UTC m=+46.777684851"
Apr 20 14:27:49.616123 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:49.616088 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:27:49.616123 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:49.616124 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:27:49.616602 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:49.616223 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:27:49.616602 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:49.616226 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:27:49.616602 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:49.616273 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:05.616259203 +0000 UTC m=+65.795747650 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:27:49.616602 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:27:49.616285 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:05.616279923 +0000 UTC m=+65.795768369 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:27:58.580132 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:27:58.580104 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-r8xsm"
Apr 20 14:28:05.626293 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:05.626263 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:28:05.626293 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:05.626298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:28:05.626768 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:05.626401 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:28:05.626768 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:05.626413 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:28:05.626768 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:05.626462 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert podName:e5f2a8a7-39d4-41d8-9ca2-1a049023a466 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:37.626445714 +0000 UTC m=+97.805934160 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert") pod "ingress-canary-htgwf" (UID: "e5f2a8a7-39d4-41d8-9ca2-1a049023a466") : secret "canary-serving-cert" not found
Apr 20 14:28:05.626768 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:05.626475 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls podName:ce3a0704-031d-43e8-87ac-e53039d6f376 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:37.626469368 +0000 UTC m=+97.805957814 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls") pod "dns-default-5qgnp" (UID: "ce3a0704-031d-43e8-87ac-e53039d6f376") : secret "dns-default-metrics-tls" not found
Apr 20 14:28:06.028878 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.028846 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:28:06.029025 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.028890 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:28:06.031599 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.031576 2581 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 14:28:06.031660 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.031643 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 14:28:06.039676 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:06.039655 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 14:28:06.039755 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:06.039719 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs podName:5b3c9c26-01c0-40b2-ba38-e4b72ba81f66 nodeName:}" failed. No retries permitted until 2026-04-20 14:29:10.039695701 +0000 UTC m=+130.219184148 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs") pod "network-metrics-daemon-t787k" (UID: "5b3c9c26-01c0-40b2-ba38-e4b72ba81f66") : secret "metrics-daemon-secret" not found
Apr 20 14:28:06.042027 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.042010 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 14:28:06.053174 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.053144 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz5j5\" (UniqueName: \"kubernetes.io/projected/9ee0de80-7894-40f1-bba0-63975a8765c3-kube-api-access-zz5j5\") pod \"network-check-target-s2vgq\" (UID: \"9ee0de80-7894-40f1-bba0-63975a8765c3\") " pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:28:06.294569 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.294504 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xgs6p\""
Apr 20 14:28:06.302364 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.302344 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:28:06.428894 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.428865 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s2vgq"]
Apr 20 14:28:06.431643 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:06.431611 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee0de80_7894_40f1_bba0_63975a8765c3.slice/crio-00ad047b5414d486d2c399ba678ac7c6deba1fb11168763f97d01b94a6c1231a WatchSource:0}: Error finding container 00ad047b5414d486d2c399ba678ac7c6deba1fb11168763f97d01b94a6c1231a: Status 404 returned error can't find the container with id 00ad047b5414d486d2c399ba678ac7c6deba1fb11168763f97d01b94a6c1231a
Apr 20 14:28:06.622038 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:06.621959 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s2vgq" event={"ID":"9ee0de80-7894-40f1-bba0-63975a8765c3","Type":"ContainerStarted","Data":"00ad047b5414d486d2c399ba678ac7c6deba1fb11168763f97d01b94a6c1231a"}
Apr 20 14:28:09.629343 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:09.629312 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s2vgq" event={"ID":"9ee0de80-7894-40f1-bba0-63975a8765c3","Type":"ContainerStarted","Data":"167e9ea665ee01137538432fab92550b085124b28e57aecf7e3df129d8418d8d"}
Apr 20 14:28:09.629666 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:09.629436 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:28:09.647188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:09.647116 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-s2vgq" podStartSLOduration=66.697522415 podStartE2EDuration="1m9.647105176s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:28:06.436630326 +0000 UTC m=+66.616118780" lastFinishedPulling="2026-04-20 14:28:09.386213095 +0000 UTC m=+69.565701541" observedRunningTime="2026-04-20 14:28:09.646824499 +0000 UTC m=+69.826312970" watchObservedRunningTime="2026-04-20 14:28:09.647105176 +0000 UTC m=+69.826593644"
Apr 20 14:28:22.285462 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.285426 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7748fc9578-6ldxb"]
Apr 20 14:28:22.291738 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.291706 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7748fc9578-6ldxb"
Apr 20 14:28:22.294275 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294250 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 14:28:22.294587 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294567 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 20 14:28:22.294701 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294675 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 20 14:28:22.294783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294682 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 20 14:28:22.294783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294686 2581
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.294909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294889 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 14:28:22.294975 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.294908 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-mlq7f\"" Apr 20 14:28:22.300750 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.300708 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7748fc9578-6ldxb"] Apr 20 14:28:22.338300 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.338276 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.338403 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.338305 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4mww\" (UniqueName: \"kubernetes.io/projected/9e88f589-217b-4f8f-a0a4-7289dd42caff-kube-api-access-g4mww\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.338403 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.338346 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: 
\"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.338481 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.338441 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-default-certificate\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.338481 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.338463 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-stats-auth\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.439464 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.439440 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-default-certificate\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.439568 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.439467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-stats-auth\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.439568 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.439485 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.439568 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.439503 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4mww\" (UniqueName: \"kubernetes.io/projected/9e88f589-217b-4f8f-a0a4-7289dd42caff-kube-api-access-g4mww\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.439719 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:22.439647 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:22.939624092 +0000 UTC m=+83.119112541 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : configmap references non-existent config key: service-ca.crt Apr 20 14:28:22.439719 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.439695 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.439854 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:22.439815 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:28:22.439912 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:22.439882 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:22.9398652 +0000 UTC m=+83.119353648 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : secret "router-metrics-certs-default" not found Apr 20 14:28:22.442045 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.442025 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-stats-auth\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.442130 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.442057 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-default-certificate\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.448012 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.447995 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4mww\" (UniqueName: \"kubernetes.io/projected/9e88f589-217b-4f8f-a0a4-7289dd42caff-kube-api-access-g4mww\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.581584 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.581510 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6hcxh"] Apr 20 14:28:22.584581 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.584562 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.584687 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.584669 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2hlrs"] Apr 20 14:28:22.586954 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.586931 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 14:28:22.587088 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.587054 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.587162 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.587112 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 14:28:22.587306 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.587287 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.587400 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.587294 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.587686 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.587673 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bv9pq\"" Apr 20 14:28:22.590009 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.589967 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.590178 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.590163 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 14:28:22.590440 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.590173 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 14:28:22.592534 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.590687 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.592534 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.590198 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6ch9w\"" Apr 20 14:28:22.596546 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.596527 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6hcxh"] Apr 20 14:28:22.597505 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.597487 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 14:28:22.598931 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.598915 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 14:28:22.599309 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.599288 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2hlrs"] Apr 20 14:28:22.641162 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641139 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-service-ca-bundle\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.641273 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641169 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-serving-cert\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.641273 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641189 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.641273 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641211 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-trusted-ca\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.641404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641270 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-config\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.641404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641293 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-serving-cert\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.641404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt49s\" (UniqueName: \"kubernetes.io/projected/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-kube-api-access-kt49s\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.641404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641344 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqztz\" (UniqueName: \"kubernetes.io/projected/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-kube-api-access-hqztz\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: 
\"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.641404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641359 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-tmp\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.641599 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.641402 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-snapshots\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.742678 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742654 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-service-ca-bundle\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.742821 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742681 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-serving-cert\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.742821 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742698 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.742821 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742719 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-trusted-ca\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.742821 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742768 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-config\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.742821 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742782 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-serving-cert\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.742821 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742805 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt49s\" (UniqueName: \"kubernetes.io/projected/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-kube-api-access-kt49s\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 
14:28:22.743097 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.742979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqztz\" (UniqueName: \"kubernetes.io/projected/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-kube-api-access-hqztz\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.743097 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.743013 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-tmp\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.743097 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.743039 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-snapshots\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.743706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.743562 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-config\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.743706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.743643 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-tmp\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: 
\"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.743706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.743684 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-trusted-ca\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.743965 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.743770 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-snapshots\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.744045 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.744027 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-service-ca-bundle\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.744163 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.744132 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.745702 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.745683 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-serving-cert\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.745811 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.745709 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-serving-cert\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.751138 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.751104 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqztz\" (UniqueName: \"kubernetes.io/projected/8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5-kube-api-access-hqztz\") pod \"console-operator-9d4b6777b-2hlrs\" (UID: \"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5\") " pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.751238 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.751165 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt49s\" (UniqueName: \"kubernetes.io/projected/a67564e3-e8db-4d6a-a8a4-591a0e2cf642-kube-api-access-kt49s\") pod \"insights-operator-585dfdc468-6hcxh\" (UID: \"a67564e3-e8db-4d6a-a8a4-591a0e2cf642\") " pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.897317 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.897259 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" Apr 20 14:28:22.903107 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.903087 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:22.945340 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.945298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.945491 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:22.945401 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:22.945533 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:22.945502 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:23.945482713 +0000 UTC m=+84.124971181 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : configmap references non-existent config key: service-ca.crt Apr 20 14:28:22.945533 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:22.945510 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:28:22.945633 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:22.945586 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:23.945569903 +0000 UTC m=+84.125058355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : secret "router-metrics-certs-default" not found Apr 20 14:28:23.024054 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:23.024028 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6hcxh"] Apr 20 14:28:23.026655 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:23.026629 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67564e3_e8db_4d6a_a8a4_591a0e2cf642.slice/crio-ac555481678b027e5849f51cec7d86e39108a3877973381847615d3298fead35 WatchSource:0}: Error finding container ac555481678b027e5849f51cec7d86e39108a3877973381847615d3298fead35: Status 404 returned error can't find the container with id ac555481678b027e5849f51cec7d86e39108a3877973381847615d3298fead35 Apr 20 14:28:23.036835 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:28:23.036812 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2hlrs"] Apr 20 14:28:23.039957 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:23.039934 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8651cc3c_2b74_4c3e_bd18_5c4883a6e3e5.slice/crio-a845e916089d5fee5590a07f48900bc763fdd07a1caf762e1adeed288e5ec72c WatchSource:0}: Error finding container a845e916089d5fee5590a07f48900bc763fdd07a1caf762e1adeed288e5ec72c: Status 404 returned error can't find the container with id a845e916089d5fee5590a07f48900bc763fdd07a1caf762e1adeed288e5ec72c Apr 20 14:28:23.659264 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:23.659227 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" event={"ID":"a67564e3-e8db-4d6a-a8a4-591a0e2cf642","Type":"ContainerStarted","Data":"ac555481678b027e5849f51cec7d86e39108a3877973381847615d3298fead35"} Apr 20 14:28:23.660281 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:23.660259 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" event={"ID":"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5","Type":"ContainerStarted","Data":"a845e916089d5fee5590a07f48900bc763fdd07a1caf762e1adeed288e5ec72c"} Apr 20 14:28:23.953113 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:23.953029 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:23.953265 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:23.953119 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:23.953265 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:23.953249 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:25.953231585 +0000 UTC m=+86.132720038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : configmap references non-existent config key: service-ca.crt Apr 20 14:28:23.953507 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:23.953474 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:28:23.953639 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:23.953528 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:25.953514424 +0000 UTC m=+86.133002873 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : secret "router-metrics-certs-default" not found Apr 20 14:28:25.967321 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:25.967264 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:25.967756 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:25.967379 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:25.967756 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:25.967490 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:29.967466513 +0000 UTC m=+90.146954959 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : configmap references non-existent config key: service-ca.crt Apr 20 14:28:25.967756 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:25.967505 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:28:25.967756 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:25.967566 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:29.967552072 +0000 UTC m=+90.147040522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : secret "router-metrics-certs-default" not found Apr 20 14:28:26.667911 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:26.667842 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/0.log" Apr 20 14:28:26.667911 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:26.667886 2581 generic.go:358] "Generic (PLEG): container finished" podID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" containerID="383cc1da6e40fca35bc66d392b73454ed024dd14407641696b11c87f39a59c1b" exitCode=255 Apr 20 14:28:26.668113 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:26.667987 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" 
event={"ID":"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5","Type":"ContainerDied","Data":"383cc1da6e40fca35bc66d392b73454ed024dd14407641696b11c87f39a59c1b"} Apr 20 14:28:26.668174 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:26.668158 2581 scope.go:117] "RemoveContainer" containerID="383cc1da6e40fca35bc66d392b73454ed024dd14407641696b11c87f39a59c1b" Apr 20 14:28:26.669317 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:26.669298 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" event={"ID":"a67564e3-e8db-4d6a-a8a4-591a0e2cf642","Type":"ContainerStarted","Data":"5a6ab794f637d54e1056a3932997e897038ea698ae3ee36e23792ba19b4ee5f7"} Apr 20 14:28:26.703115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:26.703078 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" podStartSLOduration=1.4864288110000001 podStartE2EDuration="4.703067754s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:23.028471985 +0000 UTC m=+83.207960430" lastFinishedPulling="2026-04-20 14:28:26.245110921 +0000 UTC m=+86.424599373" observedRunningTime="2026-04-20 14:28:26.702639036 +0000 UTC m=+86.882127505" watchObservedRunningTime="2026-04-20 14:28:26.703067754 +0000 UTC m=+86.882556222" Apr 20 14:28:27.027951 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.027921 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw"] Apr 20 14:28:27.030747 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.030712 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" Apr 20 14:28:27.033364 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.033343 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-lqtlt\"" Apr 20 14:28:27.033494 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.033343 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 14:28:27.033494 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.033432 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 14:28:27.041010 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.040990 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw"] Apr 20 14:28:27.075601 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.075582 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7qp6\" (UniqueName: \"kubernetes.io/projected/cca82e87-2fd5-4157-9e9f-c2ea4ea55866-kube-api-access-s7qp6\") pod \"migrator-74bb7799d9-58sfw\" (UID: \"cca82e87-2fd5-4157-9e9f-c2ea4ea55866\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" Apr 20 14:28:27.176365 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.176335 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7qp6\" (UniqueName: \"kubernetes.io/projected/cca82e87-2fd5-4157-9e9f-c2ea4ea55866-kube-api-access-s7qp6\") pod \"migrator-74bb7799d9-58sfw\" (UID: \"cca82e87-2fd5-4157-9e9f-c2ea4ea55866\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" Apr 20 14:28:27.184074 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:28:27.184047 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7qp6\" (UniqueName: \"kubernetes.io/projected/cca82e87-2fd5-4157-9e9f-c2ea4ea55866-kube-api-access-s7qp6\") pod \"migrator-74bb7799d9-58sfw\" (UID: \"cca82e87-2fd5-4157-9e9f-c2ea4ea55866\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" Apr 20 14:28:27.339597 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.339532 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" Apr 20 14:28:27.457779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.457709 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw"] Apr 20 14:28:27.460377 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:27.460351 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcca82e87_2fd5_4157_9e9f_c2ea4ea55866.slice/crio-63662fac726f6627c28d3f0a3d06fdc09c4ac7736cf30c86203ac19fa714d68e WatchSource:0}: Error finding container 63662fac726f6627c28d3f0a3d06fdc09c4ac7736cf30c86203ac19fa714d68e: Status 404 returned error can't find the container with id 63662fac726f6627c28d3f0a3d06fdc09c4ac7736cf30c86203ac19fa714d68e Apr 20 14:28:27.673176 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.673104 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/1.log" Apr 20 14:28:27.673514 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.673498 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/0.log" Apr 20 14:28:27.673580 ip-10-0-139-136 kubenswrapper[2581]: I0420 
14:28:27.673534 2581 generic.go:358] "Generic (PLEG): container finished" podID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93" exitCode=255 Apr 20 14:28:27.673648 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.673595 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" event={"ID":"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5","Type":"ContainerDied","Data":"79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93"} Apr 20 14:28:27.673648 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.673640 2581 scope.go:117] "RemoveContainer" containerID="383cc1da6e40fca35bc66d392b73454ed024dd14407641696b11c87f39a59c1b" Apr 20 14:28:27.674127 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.673890 2581 scope.go:117] "RemoveContainer" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93" Apr 20 14:28:27.674247 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:27.674128 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" Apr 20 14:28:27.674763 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:27.674717 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" event={"ID":"cca82e87-2fd5-4157-9e9f-c2ea4ea55866","Type":"ContainerStarted","Data":"63662fac726f6627c28d3f0a3d06fdc09c4ac7736cf30c86203ac19fa714d68e"} Apr 20 14:28:28.678085 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:28.678060 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/1.log" Apr 20 14:28:28.678544 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:28.678469 2581 scope.go:117] "RemoveContainer" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93" Apr 20 14:28:28.678689 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:28.678667 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" Apr 20 14:28:29.175489 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.175460 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ph5l8"] Apr 20 14:28:29.178290 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.178273 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.180869 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.180849 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 14:28:29.180997 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.180876 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 14:28:29.181242 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.181222 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 14:28:29.181328 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.181246 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 14:28:29.181789 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.181774 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-k84tq\"" Apr 20 14:28:29.186816 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.186793 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ph5l8"] Apr 20 14:28:29.293817 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.293791 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-signing-cabundle\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.293905 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.293884 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-signing-key\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.293961 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.293904 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqtb\" (UniqueName: \"kubernetes.io/projected/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-kube-api-access-dkqtb\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.395011 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.394990 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-signing-cabundle\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.395100 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.395074 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-signing-key\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.395162 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.395100 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkqtb\" (UniqueName: \"kubernetes.io/projected/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-kube-api-access-dkqtb\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.395661 ip-10-0-139-136 kubenswrapper[2581]: I0420 
14:28:29.395641 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-signing-cabundle\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.397534 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.397513 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-signing-key\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.403649 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.403627 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqtb\" (UniqueName: \"kubernetes.io/projected/9d0ab68f-0d96-424c-9081-c4a2c4f27be3-kube-api-access-dkqtb\") pod \"service-ca-865cb79987-ph5l8\" (UID: \"9d0ab68f-0d96-424c-9081-c4a2c4f27be3\") " pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.487584 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.487565 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-ph5l8" Apr 20 14:28:29.600155 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.600130 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-ph5l8"] Apr 20 14:28:29.603269 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:29.603237 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d0ab68f_0d96_424c_9081_c4a2c4f27be3.slice/crio-04506557aea3445b8585ff10434b121f59e940fffef2d59fcbc4b08bbe098acb WatchSource:0}: Error finding container 04506557aea3445b8585ff10434b121f59e940fffef2d59fcbc4b08bbe098acb: Status 404 returned error can't find the container with id 04506557aea3445b8585ff10434b121f59e940fffef2d59fcbc4b08bbe098acb Apr 20 14:28:29.681329 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.681301 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ph5l8" event={"ID":"9d0ab68f-0d96-424c-9081-c4a2c4f27be3","Type":"ContainerStarted","Data":"04506557aea3445b8585ff10434b121f59e940fffef2d59fcbc4b08bbe098acb"} Apr 20 14:28:29.682712 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.682689 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" event={"ID":"cca82e87-2fd5-4157-9e9f-c2ea4ea55866","Type":"ContainerStarted","Data":"04fa7e2bc54210d01ea43089256cb74126c5503dd32720ddc1c8356522aa811e"} Apr 20 14:28:29.682812 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.682718 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" event={"ID":"cca82e87-2fd5-4157-9e9f-c2ea4ea55866","Type":"ContainerStarted","Data":"d9ff46e745c67027b16b02dcf400af28f4e0bae8a77c43225bb65ce88acbaa2d"} Apr 20 14:28:29.701671 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:29.701623 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-58sfw" podStartSLOduration=1.36530078 podStartE2EDuration="2.701610101s" podCreationTimestamp="2026-04-20 14:28:27 +0000 UTC" firstStartedPulling="2026-04-20 14:28:27.462655562 +0000 UTC m=+87.642144008" lastFinishedPulling="2026-04-20 14:28:28.798964883 +0000 UTC m=+88.978453329" observedRunningTime="2026-04-20 14:28:29.700867275 +0000 UTC m=+89.880355745" watchObservedRunningTime="2026-04-20 14:28:29.701610101 +0000 UTC m=+89.881098566" Apr 20 14:28:30.000675 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:30.000642 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:30.000855 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:30.000747 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:30.000855 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:30.000816 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:28:30.000930 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:30.000852 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. 
No retries permitted until 2026-04-20 14:28:38.000834937 +0000 UTC m=+98.180323404 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : configmap references non-existent config key: service-ca.crt Apr 20 14:28:30.000930 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:30.000878 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:38.000865738 +0000 UTC m=+98.180354196 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : secret "router-metrics-certs-default" not found Apr 20 14:28:30.523890 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:30.523860 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lmkks_4828e4f0-6105-42f8-8ec4-54d66f7d101d/dns-node-resolver/0.log" Apr 20 14:28:31.689675 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:31.689582 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-ph5l8" event={"ID":"9d0ab68f-0d96-424c-9081-c4a2c4f27be3","Type":"ContainerStarted","Data":"ef43fcd9780e099fe33a09d3e281bb0a8e26eedce0a6c217591ec0e202ddb7f5"} Apr 20 14:28:31.707962 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:31.707919 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-ph5l8" podStartSLOduration=0.877275577 podStartE2EDuration="2.707905671s" podCreationTimestamp="2026-04-20 14:28:29 +0000 
UTC" firstStartedPulling="2026-04-20 14:28:29.605023893 +0000 UTC m=+89.784512339" lastFinishedPulling="2026-04-20 14:28:31.435653983 +0000 UTC m=+91.615142433" observedRunningTime="2026-04-20 14:28:31.70697758 +0000 UTC m=+91.886466051" watchObservedRunningTime="2026-04-20 14:28:31.707905671 +0000 UTC m=+91.887394139" Apr 20 14:28:31.723757 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:31.723716 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q5vqc_9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c/node-ca/0.log" Apr 20 14:28:32.724015 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:32.723983 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-58sfw_cca82e87-2fd5-4157-9e9f-c2ea4ea55866/migrator/0.log" Apr 20 14:28:32.903337 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:32.903313 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:32.903505 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:32.903395 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:28:32.903682 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:32.903667 2581 scope.go:117] "RemoveContainer" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93" Apr 20 14:28:32.903882 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:32.903863 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" Apr 20 14:28:32.923476 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:32.923444 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-58sfw_cca82e87-2fd5-4157-9e9f-c2ea4ea55866/graceful-termination/0.log"
Apr 20 14:28:33.695868 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:33.695835 2581 scope.go:117] "RemoveContainer" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93"
Apr 20 14:28:33.696061 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:33.696040 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5"
Apr 20 14:28:37.653470 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.653432 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:28:37.653907 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.653481 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:28:37.656169 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.656143 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e5f2a8a7-39d4-41d8-9ca2-1a049023a466-cert\") pod \"ingress-canary-htgwf\" (UID: \"e5f2a8a7-39d4-41d8-9ca2-1a049023a466\") " pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:28:37.656278 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.656267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce3a0704-031d-43e8-87ac-e53039d6f376-metrics-tls\") pod \"dns-default-5qgnp\" (UID: \"ce3a0704-031d-43e8-87ac-e53039d6f376\") " pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:28:37.943331 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.943254 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5k2mg\""
Apr 20 14:28:37.950645 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.950622 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-79jcr\""
Apr 20 14:28:37.950787 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.950773 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:28:37.958413 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:37.958388 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-htgwf"
Apr 20 14:28:38.057254 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:38.057229 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb"
Apr 20 14:28:38.057372 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:38.057307 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb"
Apr 20 14:28:38.057867 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:38.057452 2581 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 14:28:38.057867 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:38.057461 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:54.057441138 +0000 UTC m=+114.236929584 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : configmap references non-existent config key: service-ca.crt
Apr 20 14:28:38.057867 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:38.057543 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs podName:9e88f589-217b-4f8f-a0a4-7289dd42caff nodeName:}" failed. No retries permitted until 2026-04-20 14:28:54.05751617 +0000 UTC m=+114.237004631 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs") pod "router-default-7748fc9578-6ldxb" (UID: "9e88f589-217b-4f8f-a0a4-7289dd42caff") : secret "router-metrics-certs-default" not found
Apr 20 14:28:38.083380 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:38.083323 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5qgnp"]
Apr 20 14:28:38.087685 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:38.087658 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce3a0704_031d_43e8_87ac_e53039d6f376.slice/crio-a68cdeaefd0b68243e2c45a1479c47b7a18c364be47886354e66264902ef51de WatchSource:0}: Error finding container a68cdeaefd0b68243e2c45a1479c47b7a18c364be47886354e66264902ef51de: Status 404 returned error can't find the container with id a68cdeaefd0b68243e2c45a1479c47b7a18c364be47886354e66264902ef51de
Apr 20 14:28:38.098881 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:38.098859 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-htgwf"]
Apr 20 14:28:38.101503 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:38.101477 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5f2a8a7_39d4_41d8_9ca2_1a049023a466.slice/crio-682f605cac1592e7545913a56a2bc1c183ab85fec59c250349d0a959fc0c5d75 WatchSource:0}: Error finding container 682f605cac1592e7545913a56a2bc1c183ab85fec59c250349d0a959fc0c5d75: Status 404 returned error can't find the container with id 682f605cac1592e7545913a56a2bc1c183ab85fec59c250349d0a959fc0c5d75
Apr 20 14:28:38.714187 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:38.714153 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-htgwf" event={"ID":"e5f2a8a7-39d4-41d8-9ca2-1a049023a466","Type":"ContainerStarted","Data":"682f605cac1592e7545913a56a2bc1c183ab85fec59c250349d0a959fc0c5d75"}
Apr 20 14:28:38.715168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:38.715137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5qgnp" event={"ID":"ce3a0704-031d-43e8-87ac-e53039d6f376","Type":"ContainerStarted","Data":"a68cdeaefd0b68243e2c45a1479c47b7a18c364be47886354e66264902ef51de"}
Apr 20 14:28:40.634239 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.634209 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-s2vgq"
Apr 20 14:28:40.720458 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.720425 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-htgwf" event={"ID":"e5f2a8a7-39d4-41d8-9ca2-1a049023a466","Type":"ContainerStarted","Data":"3cbe2a9ff7b7699608518a8c107bc15b6224c29bf6c4fa2b68125b42d5e82186"}
Apr 20 14:28:40.722046 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.722018 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5qgnp" event={"ID":"ce3a0704-031d-43e8-87ac-e53039d6f376","Type":"ContainerStarted","Data":"65349ff47f75768e1796813c3915cba2759abd36b66d3b8592b43f9614f3b3ac"}
Apr 20 14:28:40.722046 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.722047 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5qgnp" event={"ID":"ce3a0704-031d-43e8-87ac-e53039d6f376","Type":"ContainerStarted","Data":"313e281c4026f3d35e695fb338e5c5186c97c29baf6c6e9e0b94f892f68b60fa"}
Apr 20 14:28:40.722194 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.722106 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:28:40.743583 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.743533 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-htgwf" podStartSLOduration=65.421664931 podStartE2EDuration="1m7.743521048s" podCreationTimestamp="2026-04-20 14:27:33 +0000 UTC" firstStartedPulling="2026-04-20 14:28:38.10365654 +0000 UTC m=+98.283144992" lastFinishedPulling="2026-04-20 14:28:40.425512654 +0000 UTC m=+100.605001109" observedRunningTime="2026-04-20 14:28:40.742377951 +0000 UTC m=+100.921866418" watchObservedRunningTime="2026-04-20 14:28:40.743521048 +0000 UTC m=+100.923009516"
Apr 20 14:28:40.759353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:40.759317 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5qgnp" podStartSLOduration=65.421979432 podStartE2EDuration="1m7.759306557s" podCreationTimestamp="2026-04-20 14:27:33 +0000 UTC" firstStartedPulling="2026-04-20 14:28:38.089594042 +0000 UTC m=+98.269082488" lastFinishedPulling="2026-04-20 14:28:40.426921165 +0000 UTC m=+100.606409613" observedRunningTime="2026-04-20 14:28:40.759052817 +0000 UTC m=+100.938541297" watchObservedRunningTime="2026-04-20 14:28:40.759306557 +0000 UTC m=+100.938795024"
Apr 20 14:28:47.383197 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.383164 2581 scope.go:117] "RemoveContainer" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93"
Apr 20 14:28:47.743409 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.743380 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 14:28:47.743746 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.743705 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/1.log"
Apr 20 14:28:47.743842 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.743769 2581 generic.go:358] "Generic (PLEG): container finished" podID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" containerID="982c328deb2c04d552efe58124307cda68e9881638a6c650debcfe9fc8001070" exitCode=255
Apr 20 14:28:47.743842 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.743820 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" event={"ID":"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5","Type":"ContainerDied","Data":"982c328deb2c04d552efe58124307cda68e9881638a6c650debcfe9fc8001070"}
Apr 20 14:28:47.743949 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.743862 2581 scope.go:117] "RemoveContainer" containerID="79ddce295785c3e97936a91963ad768828d3057458c4f3daed93e29a0a79ac93"
Apr 20 14:28:47.744245 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:47.744226 2581 scope.go:117] "RemoveContainer" containerID="982c328deb2c04d552efe58124307cda68e9881638a6c650debcfe9fc8001070"
Apr 20 14:28:47.744424 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:47.744408 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5"
Apr 20 14:28:48.747747 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:48.747693 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 14:28:50.726353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:50.726321 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5qgnp"
Apr 20 14:28:51.675236 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.675209 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-v88gx"]
Apr 20 14:28:51.679469 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.679445 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.690264 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.690247 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 14:28:51.690264 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.690255 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 14:28:51.690391 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.690255 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rq5c2\""
Apr 20 14:28:51.721938 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.721915 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v88gx"]
Apr 20 14:28:51.755583 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.755557 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c07adcb0-9da3-4948-b0cc-5a16508f6526-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.755941 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.755594 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c07adcb0-9da3-4948-b0cc-5a16508f6526-crio-socket\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.755941 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.755622 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c07adcb0-9da3-4948-b0cc-5a16508f6526-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.755941 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.755670 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5gb\" (UniqueName: \"kubernetes.io/projected/c07adcb0-9da3-4948-b0cc-5a16508f6526-kube-api-access-wr5gb\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.755941 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.755699 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c07adcb0-9da3-4948-b0cc-5a16508f6526-data-volume\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.792548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.792526 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d997f6dc7-kx4pp"]
Apr 20 14:28:51.795480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.795464 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.797906 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.797887 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 14:28:51.798018 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.798004 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 14:28:51.798018 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.798011 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bsqq7\""
Apr 20 14:28:51.798110 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.798020 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 14:28:51.803479 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.803460 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 14:28:51.810553 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.810536 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d997f6dc7-kx4pp"]
Apr 20 14:28:51.856380 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856352 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c07adcb0-9da3-4948-b0cc-5a16508f6526-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.856510 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856395 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c07adcb0-9da3-4948-b0cc-5a16508f6526-crio-socket\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.856510 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856471 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c07adcb0-9da3-4948-b0cc-5a16508f6526-crio-socket\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.856602 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856509 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c07adcb0-9da3-4948-b0cc-5a16508f6526-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.856602 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856546 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5gb\" (UniqueName: \"kubernetes.io/projected/c07adcb0-9da3-4948-b0cc-5a16508f6526-kube-api-access-wr5gb\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.856602 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856573 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c07adcb0-9da3-4948-b0cc-5a16508f6526-data-volume\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.856894 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.856874 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c07adcb0-9da3-4948-b0cc-5a16508f6526-data-volume\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.857035 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.857020 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c07adcb0-9da3-4948-b0cc-5a16508f6526-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.858698 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.858673 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c07adcb0-9da3-4948-b0cc-5a16508f6526-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.865586 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.865569 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5gb\" (UniqueName: \"kubernetes.io/projected/c07adcb0-9da3-4948-b0cc-5a16508f6526-kube-api-access-wr5gb\") pod \"insights-runtime-extractor-v88gx\" (UID: \"c07adcb0-9da3-4948-b0cc-5a16508f6526\") " pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:51.957264 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957211 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a768c89-d5ac-4506-9349-6e193e93c0e4-registry-certificates\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957274 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a768c89-d5ac-4506-9349-6e193e93c0e4-image-registry-private-configuration\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957300 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a768c89-d5ac-4506-9349-6e193e93c0e4-ca-trust-extracted\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957322 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a768c89-d5ac-4506-9349-6e193e93c0e4-trusted-ca\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957337 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-bound-sa-token\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957485 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957379 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqllh\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-kube-api-access-jqllh\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957485 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957413 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a768c89-d5ac-4506-9349-6e193e93c0e4-installation-pull-secrets\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.957485 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.957439 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-registry-tls\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:51.988673 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:51.988646 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-v88gx"
Apr 20 14:28:52.058369 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058326 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a768c89-d5ac-4506-9349-6e193e93c0e4-registry-certificates\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058521 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058455 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a768c89-d5ac-4506-9349-6e193e93c0e4-image-registry-private-configuration\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058521 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058489 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a768c89-d5ac-4506-9349-6e193e93c0e4-ca-trust-extracted\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058640 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058538 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a768c89-d5ac-4506-9349-6e193e93c0e4-trusted-ca\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058640 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058563 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-bound-sa-token\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058640 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058592 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqllh\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-kube-api-access-jqllh\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058812 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058652 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a768c89-d5ac-4506-9349-6e193e93c0e4-installation-pull-secrets\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058812 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058684 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-registry-tls\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.058917 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.058843 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5a768c89-d5ac-4506-9349-6e193e93c0e4-ca-trust-extracted\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.059375 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.059195 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5a768c89-d5ac-4506-9349-6e193e93c0e4-registry-certificates\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.059626 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.059605 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5a768c89-d5ac-4506-9349-6e193e93c0e4-trusted-ca\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.066556 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.061563 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-registry-tls\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.066556 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.061927 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5a768c89-d5ac-4506-9349-6e193e93c0e4-installation-pull-secrets\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.066556 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.062198 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5a768c89-d5ac-4506-9349-6e193e93c0e4-image-registry-private-configuration\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.067363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.067314 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-bound-sa-token\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.068475 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.068452 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqllh\" (UniqueName: \"kubernetes.io/projected/5a768c89-d5ac-4506-9349-6e193e93c0e4-kube-api-access-jqllh\") pod \"image-registry-6d997f6dc7-kx4pp\" (UID: \"5a768c89-d5ac-4506-9349-6e193e93c0e4\") " pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.105122 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.105098 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.110353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.110330 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-v88gx"]
Apr 20 14:28:52.114403 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:52.114373 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07adcb0_9da3_4948_b0cc_5a16508f6526.slice/crio-8b1658ff35f3b47299bb345e76db2c3847b4ec31980aeca2479b04ba41978747 WatchSource:0}: Error finding container 8b1658ff35f3b47299bb345e76db2c3847b4ec31980aeca2479b04ba41978747: Status 404 returned error can't find the container with id 8b1658ff35f3b47299bb345e76db2c3847b4ec31980aeca2479b04ba41978747
Apr 20 14:28:52.227905 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.227878 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d997f6dc7-kx4pp"]
Apr 20 14:28:52.231043 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:52.231011 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a768c89_d5ac_4506_9349_6e193e93c0e4.slice/crio-7b35fe2567cacf368074b3893b50f8cb867c4dfa2678283683c1184be7249390 WatchSource:0}: Error finding container 7b35fe2567cacf368074b3893b50f8cb867c4dfa2678283683c1184be7249390: Status 404 returned error can't find the container with id 7b35fe2567cacf368074b3893b50f8cb867c4dfa2678283683c1184be7249390
Apr 20 14:28:52.758265 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.758233 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp" event={"ID":"5a768c89-d5ac-4506-9349-6e193e93c0e4","Type":"ContainerStarted","Data":"64e994823df439718c9aa58809a5ac4398e058509851dbef4861213e504eaaba"}
Apr 20 14:28:52.758714 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.758273 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp" event={"ID":"5a768c89-d5ac-4506-9349-6e193e93c0e4","Type":"ContainerStarted","Data":"7b35fe2567cacf368074b3893b50f8cb867c4dfa2678283683c1184be7249390"}
Apr 20 14:28:52.758714 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.758310 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:28:52.759546 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.759516 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v88gx" event={"ID":"c07adcb0-9da3-4948-b0cc-5a16508f6526","Type":"ContainerStarted","Data":"e2de5314f372d99e689662a29f3d68e94fa6bbc21d9756d17151decc2480b461"}
Apr 20 14:28:52.759644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.759550 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v88gx" event={"ID":"c07adcb0-9da3-4948-b0cc-5a16508f6526","Type":"ContainerStarted","Data":"8b1658ff35f3b47299bb345e76db2c3847b4ec31980aeca2479b04ba41978747"}
Apr 20 14:28:52.779665 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.779618 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp" podStartSLOduration=1.779607081 podStartE2EDuration="1.779607081s" podCreationTimestamp="2026-04-20 14:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:28:52.778960518 +0000 UTC m=+112.958448987" watchObservedRunningTime="2026-04-20 14:28:52.779607081 +0000 UTC m=+112.959095543"
Apr 20 14:28:52.903882 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.903860 2581 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs"
Apr 20 14:28:52.904002 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.903894 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs"
Apr 20 14:28:52.904223 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:52.904208 2581 scope.go:117] "RemoveContainer" containerID="982c328deb2c04d552efe58124307cda68e9881638a6c650debcfe9fc8001070"
Apr 20 14:28:52.904394 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:28:52.904376 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5"
Apr 20 14:28:53.763851 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:53.763811 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v88gx" event={"ID":"c07adcb0-9da3-4948-b0cc-5a16508f6526","Type":"ContainerStarted","Data":"439595541b36c1f0ff0623c92cc09c7bdaef614b96c39df44f06da5c011495ac"}
Apr 20 14:28:54.073786 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.073687 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb"
Apr 20 14:28:54.073786 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.073751 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName:
\"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:54.074445 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.074417 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e88f589-217b-4f8f-a0a4-7289dd42caff-service-ca-bundle\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:54.076537 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.076512 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e88f589-217b-4f8f-a0a4-7289dd42caff-metrics-certs\") pod \"router-default-7748fc9578-6ldxb\" (UID: \"9e88f589-217b-4f8f-a0a4-7289dd42caff\") " pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:54.102308 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.102286 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:54.243977 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.243945 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7748fc9578-6ldxb"] Apr 20 14:28:54.578400 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:28:54.578357 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e88f589_217b_4f8f_a0a4_7289dd42caff.slice/crio-120b28d0f5801ebd9f4a6183bda1461a840802abd5a93980c1b4145e3aff8c59 WatchSource:0}: Error finding container 120b28d0f5801ebd9f4a6183bda1461a840802abd5a93980c1b4145e3aff8c59: Status 404 returned error can't find the container with id 120b28d0f5801ebd9f4a6183bda1461a840802abd5a93980c1b4145e3aff8c59 Apr 20 14:28:54.767614 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.767576 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7748fc9578-6ldxb" event={"ID":"9e88f589-217b-4f8f-a0a4-7289dd42caff","Type":"ContainerStarted","Data":"9817a2ffd6005c363240b9fd4041e26a6ef65e89dbfe306a4c69b5deb5602168"} Apr 20 14:28:54.767614 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.767617 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7748fc9578-6ldxb" event={"ID":"9e88f589-217b-4f8f-a0a4-7289dd42caff","Type":"ContainerStarted","Data":"120b28d0f5801ebd9f4a6183bda1461a840802abd5a93980c1b4145e3aff8c59"} Apr 20 14:28:54.769468 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.769434 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-v88gx" event={"ID":"c07adcb0-9da3-4948-b0cc-5a16508f6526","Type":"ContainerStarted","Data":"3b17ecaae601dbfbe7fa5bacbb18886a6062c7f98ae586a42b6827ee9d209579"} Apr 20 14:28:54.838617 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.838530 2581 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ingress/router-default-7748fc9578-6ldxb" podStartSLOduration=32.838515695 podStartE2EDuration="32.838515695s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:28:54.837452735 +0000 UTC m=+115.016941202" watchObservedRunningTime="2026-04-20 14:28:54.838515695 +0000 UTC m=+115.018004162" Apr 20 14:28:54.865221 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:54.865179 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-v88gx" podStartSLOduration=1.414570373 podStartE2EDuration="3.865165044s" podCreationTimestamp="2026-04-20 14:28:51 +0000 UTC" firstStartedPulling="2026-04-20 14:28:52.176366547 +0000 UTC m=+112.355855001" lastFinishedPulling="2026-04-20 14:28:54.626961213 +0000 UTC m=+114.806449672" observedRunningTime="2026-04-20 14:28:54.864907886 +0000 UTC m=+115.044396357" watchObservedRunningTime="2026-04-20 14:28:54.865165044 +0000 UTC m=+115.044653512" Apr 20 14:28:55.102672 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:55.102582 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:55.105188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:55.105167 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:55.772671 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:55.772631 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 14:28:55.774051 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:28:55.774024 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7748fc9578-6ldxb" Apr 20 
14:29:04.383187 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.383155 2581 scope.go:117] "RemoveContainer" containerID="982c328deb2c04d552efe58124307cda68e9881638a6c650debcfe9fc8001070" Apr 20 14:29:04.383762 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:29:04.383374 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2hlrs_openshift-console-operator(8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podUID="8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5" Apr 20 14:29:04.644175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.644100 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-l65ch"] Apr 20 14:29:04.659399 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.659369 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.661815 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.661717 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 14:29:04.661943 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.661793 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 14:29:04.661943 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.661870 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 14:29:04.661943 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.661878 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 14:29:04.662087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.661941 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 14:29:04.662649 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.662632 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 14:29:04.662753 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.662638 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vhd9x\"" Apr 20 14:29:04.750229 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750200 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9682a026-b717-4f08-a6a3-d0eea65a227e-metrics-client-ca\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " 
pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750332 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750239 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-sys\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750332 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750268 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750332 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750322 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6bc\" (UniqueName: \"kubernetes.io/projected/9682a026-b717-4f08-a6a3-d0eea65a227e-kube-api-access-7x6bc\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750445 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750373 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-root\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750445 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750394 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-accelerators-collector-config\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750442 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-wtmp\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750466 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-textfile\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.750523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.750485 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-tls\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.850983 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.850960 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-textfile\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851080 ip-10-0-139-136 kubenswrapper[2581]: 
I0420 14:29:04.850988 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-tls\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851080 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851014 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9682a026-b717-4f08-a6a3-d0eea65a227e-metrics-client-ca\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851080 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851036 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-sys\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851080 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851062 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851250 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851115 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-sys\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851250 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:29:04.851169 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6bc\" (UniqueName: \"kubernetes.io/projected/9682a026-b717-4f08-a6a3-d0eea65a227e-kube-api-access-7x6bc\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851250 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851231 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-root\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851394 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851260 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-accelerators-collector-config\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851394 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851281 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-root\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" Apr 20 14:29:04.851394 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-wtmp\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch" 
Apr 20 14:29:04.851543 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851378 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-textfile\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.851543 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851439 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-wtmp\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.851685 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9682a026-b717-4f08-a6a3-d0eea65a227e-metrics-client-ca\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.851782 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.851760 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-accelerators-collector-config\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.853458 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.853438 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.853573 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.853559 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9682a026-b717-4f08-a6a3-d0eea65a227e-node-exporter-tls\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.858778 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.858759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x6bc\" (UniqueName: \"kubernetes.io/projected/9682a026-b717-4f08-a6a3-d0eea65a227e-kube-api-access-7x6bc\") pod \"node-exporter-l65ch\" (UID: \"9682a026-b717-4f08-a6a3-d0eea65a227e\") " pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.970616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:04.970587 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-l65ch"
Apr 20 14:29:04.978514 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:29:04.978492 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9682a026_b717_4f08_a6a3_d0eea65a227e.slice/crio-7d46fa6d6a78138675a7490e0af7ad5852de313047ba45530504727c5842e8ba WatchSource:0}: Error finding container 7d46fa6d6a78138675a7490e0af7ad5852de313047ba45530504727c5842e8ba: Status 404 returned error can't find the container with id 7d46fa6d6a78138675a7490e0af7ad5852de313047ba45530504727c5842e8ba
Apr 20 14:29:05.800642 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:05.800617 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l65ch" event={"ID":"9682a026-b717-4f08-a6a3-d0eea65a227e","Type":"ContainerStarted","Data":"b3383e547e7f791ba5fb87d4cee36cca2591c2b85a8509cbca12fddd786e0a0c"}
Apr 20 14:29:05.800947 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:05.800653 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l65ch" event={"ID":"9682a026-b717-4f08-a6a3-d0eea65a227e","Type":"ContainerStarted","Data":"7d46fa6d6a78138675a7490e0af7ad5852de313047ba45530504727c5842e8ba"}
Apr 20 14:29:06.806719 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:06.806685 2581 generic.go:358] "Generic (PLEG): container finished" podID="9682a026-b717-4f08-a6a3-d0eea65a227e" containerID="b3383e547e7f791ba5fb87d4cee36cca2591c2b85a8509cbca12fddd786e0a0c" exitCode=0
Apr 20 14:29:06.807119 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:06.806742 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l65ch" event={"ID":"9682a026-b717-4f08-a6a3-d0eea65a227e","Type":"ContainerDied","Data":"b3383e547e7f791ba5fb87d4cee36cca2591c2b85a8509cbca12fddd786e0a0c"}
Apr 20 14:29:07.811486 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:07.811455 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l65ch" event={"ID":"9682a026-b717-4f08-a6a3-d0eea65a227e","Type":"ContainerStarted","Data":"b62dc6cda0d8d97c48cfefaf42e98b7b727038156da4c23dc5a54068b459df88"}
Apr 20 14:29:07.811933 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:07.811498 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-l65ch" event={"ID":"9682a026-b717-4f08-a6a3-d0eea65a227e","Type":"ContainerStarted","Data":"838147b974c9a9e5a994b007942e37406d7dcc2208738079e2acae070be91747"}
Apr 20 14:29:07.836188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:07.836144 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-l65ch" podStartSLOduration=3.094274603 podStartE2EDuration="3.836130666s" podCreationTimestamp="2026-04-20 14:29:04 +0000 UTC" firstStartedPulling="2026-04-20 14:29:04.98033451 +0000 UTC m=+125.159822955" lastFinishedPulling="2026-04-20 14:29:05.722190558 +0000 UTC m=+125.901679018" observedRunningTime="2026-04-20 14:29:07.834383895 +0000 UTC m=+128.013872365" watchObservedRunningTime="2026-04-20 14:29:07.836130666 +0000 UTC m=+128.015619205"
Apr 20 14:29:09.429475 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.429445 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"]
Apr 20 14:29:09.431370 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.431355 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:09.433839 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.433819 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gzbx2\""
Apr 20 14:29:09.433931 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.433821 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 14:29:09.439541 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.439512 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"]
Apr 20 14:29:09.589259 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.589229 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9bd2c509-8ac1-41be-a6d6-6741cf3e3491-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4x946\" (UID: \"9bd2c509-8ac1-41be-a6d6-6741cf3e3491\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:09.689793 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.689710 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9bd2c509-8ac1-41be-a6d6-6741cf3e3491-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4x946\" (UID: \"9bd2c509-8ac1-41be-a6d6-6741cf3e3491\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:09.692233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.692213 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9bd2c509-8ac1-41be-a6d6-6741cf3e3491-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4x946\" (UID: \"9bd2c509-8ac1-41be-a6d6-6741cf3e3491\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:09.740423 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.740398 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:09.855031 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:09.855004 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"]
Apr 20 14:29:09.857841 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:29:09.857807 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd2c509_8ac1_41be_a6d6_6741cf3e3491.slice/crio-d92d14f3bceb2262a706394933c5866e32b4d10052b6bde1fe5a4a08275011c2 WatchSource:0}: Error finding container d92d14f3bceb2262a706394933c5866e32b4d10052b6bde1fe5a4a08275011c2: Status 404 returned error can't find the container with id d92d14f3bceb2262a706394933c5866e32b4d10052b6bde1fe5a4a08275011c2
Apr 20 14:29:10.093570 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.093543 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:29:10.095930 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.095907 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b3c9c26-01c0-40b2-ba38-e4b72ba81f66-metrics-certs\") pod \"network-metrics-daemon-t787k\" (UID: \"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66\") " pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:29:10.199204 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.199170 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xmk8q\""
Apr 20 14:29:10.207138 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.207118 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t787k"
Apr 20 14:29:10.328540 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.328493 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t787k"]
Apr 20 14:29:10.334229 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:29:10.334190 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b3c9c26_01c0_40b2_ba38_e4b72ba81f66.slice/crio-544f580597c3435ba75bc4600cf1287849349e90e0036fa76ccd9d39fe361540 WatchSource:0}: Error finding container 544f580597c3435ba75bc4600cf1287849349e90e0036fa76ccd9d39fe361540: Status 404 returned error can't find the container with id 544f580597c3435ba75bc4600cf1287849349e90e0036fa76ccd9d39fe361540
Apr 20 14:29:10.821274 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.821235 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t787k" event={"ID":"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66","Type":"ContainerStarted","Data":"544f580597c3435ba75bc4600cf1287849349e90e0036fa76ccd9d39fe361540"}
Apr 20 14:29:10.822321 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:10.822293 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946" event={"ID":"9bd2c509-8ac1-41be-a6d6-6741cf3e3491","Type":"ContainerStarted","Data":"d92d14f3bceb2262a706394933c5866e32b4d10052b6bde1fe5a4a08275011c2"}
Apr 20 14:29:11.826266 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:11.826235 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t787k" event={"ID":"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66","Type":"ContainerStarted","Data":"d98726aed590b389ee64952860664e0788a124b278ffc52d443f9a863f657da0"}
Apr 20 14:29:11.827453 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:11.827428 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946" event={"ID":"9bd2c509-8ac1-41be-a6d6-6741cf3e3491","Type":"ContainerStarted","Data":"4a7b7625d7535bf925e9b2671e3ac95e50926bdba2f9bc0c21d2ef6975550774"}
Apr 20 14:29:11.827674 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:11.827656 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:11.832246 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:11.832225 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946"
Apr 20 14:29:11.843172 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:11.843136 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4x946" podStartSLOduration=1.074346639 podStartE2EDuration="2.843119832s" podCreationTimestamp="2026-04-20 14:29:09 +0000 UTC" firstStartedPulling="2026-04-20 14:29:09.859699244 +0000 UTC m=+130.039187689" lastFinishedPulling="2026-04-20 14:29:11.628472436 +0000 UTC m=+131.807960882" observedRunningTime="2026-04-20 14:29:11.841822505 +0000 UTC m=+132.021310972" watchObservedRunningTime="2026-04-20 14:29:11.843119832 +0000 UTC m=+132.022608365"
Apr 20 14:29:12.832139 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:12.832095 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t787k" event={"ID":"5b3c9c26-01c0-40b2-ba38-e4b72ba81f66","Type":"ContainerStarted","Data":"84470ea383fb74499bb9d93c130ac089d0d61d3f51437a32d4b9a613e701e53e"}
Apr 20 14:29:12.854706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:12.854653 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-t787k" podStartSLOduration=131.566167218 podStartE2EDuration="2m12.854638442s" podCreationTimestamp="2026-04-20 14:27:00 +0000 UTC" firstStartedPulling="2026-04-20 14:29:10.338296911 +0000 UTC m=+130.517785357" lastFinishedPulling="2026-04-20 14:29:11.626768134 +0000 UTC m=+131.806256581" observedRunningTime="2026-04-20 14:29:12.853593437 +0000 UTC m=+133.033081905" watchObservedRunningTime="2026-04-20 14:29:12.854638442 +0000 UTC m=+133.034126950"
Apr 20 14:29:13.768457 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:13.768427 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d997f6dc7-kx4pp"
Apr 20 14:29:18.383644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:18.383612 2581 scope.go:117] "RemoveContainer" containerID="982c328deb2c04d552efe58124307cda68e9881638a6c650debcfe9fc8001070"
Apr 20 14:29:18.850685 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:18.850658 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 14:29:18.850859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:18.850714 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" event={"ID":"8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5","Type":"ContainerStarted","Data":"28f52717cf743d2e2878df9bda4f4796bc529826ae2faf1d7e077f76388ba1b4"}
Apr 20 14:29:18.851012 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:18.850988 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs"
Apr 20 14:29:18.868440 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:18.868414 2581 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" Apr 20 14:29:18.869027 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:18.868962 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2hlrs" podStartSLOduration=53.666742191 podStartE2EDuration="56.868926593s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:23.041644992 +0000 UTC m=+83.221133437" lastFinishedPulling="2026-04-20 14:28:26.24382939 +0000 UTC m=+86.423317839" observedRunningTime="2026-04-20 14:29:18.867117932 +0000 UTC m=+139.046606425" watchObservedRunningTime="2026-04-20 14:29:18.868926593 +0000 UTC m=+139.048415061" Apr 20 14:29:19.047931 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.047896 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rqnwg"] Apr 20 14:29:19.050875 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.050859 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:19.053994 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.053675 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 14:29:19.053994 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.053712 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 14:29:19.053994 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.053951 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xhgc4\"" Apr 20 14:29:19.061487 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.061466 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rqnwg"] Apr 20 14:29:19.157943 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.157886 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjpm\" (UniqueName: \"kubernetes.io/projected/274e2639-2488-4c0c-a9f2-129da2913092-kube-api-access-2bjpm\") pod \"downloads-6bcc868b7-rqnwg\" (UID: \"274e2639-2488-4c0c-a9f2-129da2913092\") " pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:19.258970 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.258942 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjpm\" (UniqueName: \"kubernetes.io/projected/274e2639-2488-4c0c-a9f2-129da2913092-kube-api-access-2bjpm\") pod \"downloads-6bcc868b7-rqnwg\" (UID: \"274e2639-2488-4c0c-a9f2-129da2913092\") " pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:19.268582 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.268561 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjpm\" (UniqueName: 
\"kubernetes.io/projected/274e2639-2488-4c0c-a9f2-129da2913092-kube-api-access-2bjpm\") pod \"downloads-6bcc868b7-rqnwg\" (UID: \"274e2639-2488-4c0c-a9f2-129da2913092\") " pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:19.363526 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.363499 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:19.487350 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.487199 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rqnwg"] Apr 20 14:29:19.489883 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:29:19.489846 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274e2639_2488_4c0c_a9f2_129da2913092.slice/crio-e2cdb4581ca32904f93bd2a58830c40261efe09282e788caf8a8605f3435daec WatchSource:0}: Error finding container e2cdb4581ca32904f93bd2a58830c40261efe09282e788caf8a8605f3435daec: Status 404 returned error can't find the container with id e2cdb4581ca32904f93bd2a58830c40261efe09282e788caf8a8605f3435daec Apr 20 14:29:19.855933 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:19.855894 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rqnwg" event={"ID":"274e2639-2488-4c0c-a9f2-129da2913092","Type":"ContainerStarted","Data":"e2cdb4581ca32904f93bd2a58830c40261efe09282e788caf8a8605f3435daec"} Apr 20 14:29:28.181449 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.181407 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bccdb78fc-wlsht"] Apr 20 14:29:28.186338 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.186319 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.188884 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.188843 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 14:29:28.189836 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.189816 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 14:29:28.189836 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.189822 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 14:29:28.189836 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.189822 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 14:29:28.190101 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.189870 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 14:29:28.190101 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.189821 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-px8w5\"" Apr 20 14:29:28.199360 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.199340 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bccdb78fc-wlsht"] Apr 20 14:29:28.224492 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.224457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-oauth-serving-cert\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.224605 ip-10-0-139-136 kubenswrapper[2581]: I0420 
14:29:28.224513 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mchqp\" (UniqueName: \"kubernetes.io/projected/2ec71445-b718-44da-b784-516ee20294b6-kube-api-access-mchqp\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.224605 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.224560 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-service-ca\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.224605 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.224595 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-oauth-config\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.224776 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.224626 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-console-config\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.224776 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.224669 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-serving-cert\") pod \"console-bccdb78fc-wlsht\" (UID: 
\"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.325950 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.325917 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-oauth-serving-cert\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.325950 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.325951 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mchqp\" (UniqueName: \"kubernetes.io/projected/2ec71445-b718-44da-b784-516ee20294b6-kube-api-access-mchqp\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.326165 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.325975 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-service-ca\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.326165 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.325995 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-oauth-config\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.326165 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.326074 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-console-config\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.326165 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.326140 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-serving-cert\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.326859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.326785 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-oauth-serving-cert\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.326986 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.326899 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-console-config\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.327072 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.326994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-service-ca\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.328805 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.328785 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-oauth-config\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.328988 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.328967 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-serving-cert\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.334010 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.333992 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mchqp\" (UniqueName: \"kubernetes.io/projected/2ec71445-b718-44da-b784-516ee20294b6-kube-api-access-mchqp\") pod \"console-bccdb78fc-wlsht\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") " pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:28.497926 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:28.497895 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bccdb78fc-wlsht" Apr 20 14:29:35.008153 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:35.008127 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bccdb78fc-wlsht"] Apr 20 14:29:35.017817 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:29:35.017791 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ec71445_b718_44da_b784_516ee20294b6.slice/crio-b3a010a11c9411b5583998949c782b309cf46fe16d1f3920f5098acc70719a15 WatchSource:0}: Error finding container b3a010a11c9411b5583998949c782b309cf46fe16d1f3920f5098acc70719a15: Status 404 returned error can't find the container with id b3a010a11c9411b5583998949c782b309cf46fe16d1f3920f5098acc70719a15 Apr 20 14:29:35.910323 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:35.910206 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccdb78fc-wlsht" event={"ID":"2ec71445-b718-44da-b784-516ee20294b6","Type":"ContainerStarted","Data":"b3a010a11c9411b5583998949c782b309cf46fe16d1f3920f5098acc70719a15"} Apr 20 14:29:35.912247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:35.912164 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rqnwg" event={"ID":"274e2639-2488-4c0c-a9f2-129da2913092","Type":"ContainerStarted","Data":"bc2bcf7c1ba96660a8ce509a1a718ebda291431bc0a3f20af456abff7cdd7cca"} Apr 20 14:29:35.913189 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:35.913165 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:35.924917 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:35.924843 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rqnwg" Apr 20 14:29:35.932035 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:35.931991 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rqnwg" podStartSLOduration=1.4411367880000001 podStartE2EDuration="16.931974396s" podCreationTimestamp="2026-04-20 14:29:19 +0000 UTC" firstStartedPulling="2026-04-20 14:29:19.491669757 +0000 UTC m=+139.671158203" lastFinishedPulling="2026-04-20 14:29:34.98250731 +0000 UTC m=+155.161995811" observedRunningTime="2026-04-20 14:29:35.929294493 +0000 UTC m=+156.108782962" watchObservedRunningTime="2026-04-20 14:29:35.931974396 +0000 UTC m=+156.111462865" Apr 20 14:29:37.355836 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.354649 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64bcf8fcc6-f6lsk"] Apr 20 14:29:37.362248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.362163 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.367226 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.367203 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64bcf8fcc6-f6lsk"] Apr 20 14:29:37.373377 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.371459 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 14:29:37.398254 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.398181 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-serving-cert\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.398254 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.398230 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-oauth-serving-cert\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.398426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.398263 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-config\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.398426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.398302 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-service-ca\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.398426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.398331 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snhqw\" (UniqueName: \"kubernetes.io/projected/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-kube-api-access-snhqw\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.398426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.398355 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-oauth-config\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.398426 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:29:37.398418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-trusted-ca-bundle\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.500810 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.499887 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-serving-cert\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.500810 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.500798 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-oauth-serving-cert\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.501021 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.500868 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-config\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.501021 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.500941 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-service-ca\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " 
pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.501021 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.500971 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snhqw\" (UniqueName: \"kubernetes.io/projected/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-kube-api-access-snhqw\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.501021 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.501000 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-oauth-config\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.501227 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.501048 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-trusted-ca-bundle\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.502552 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.501433 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-oauth-serving-cert\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.502552 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.502269 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-service-ca\") pod 
\"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.502552 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.502503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-config\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.503639 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.503613 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-serving-cert\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.505143 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.505121 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-oauth-config\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.508113 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.508071 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-trusted-ca-bundle\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:29:37.510834 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.510769 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snhqw\" (UniqueName: 
\"kubernetes.io/projected/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-kube-api-access-snhqw\") pod \"console-64bcf8fcc6-f6lsk\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") " pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:29:37.677644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:37.677549 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:29:38.805285 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:38.802441 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64bcf8fcc6-f6lsk"]
Apr 20 14:29:38.806747 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:29:38.806602 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f6c12c3_4870_43b4_b65f_988c9d9e9c86.slice/crio-574f015ff4379fdd398c4396259c36a03dc637690f16c190eefdc283898e12b4 WatchSource:0}: Error finding container 574f015ff4379fdd398c4396259c36a03dc637690f16c190eefdc283898e12b4: Status 404 returned error can't find the container with id 574f015ff4379fdd398c4396259c36a03dc637690f16c190eefdc283898e12b4
Apr 20 14:29:38.923806 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:38.923719 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bcf8fcc6-f6lsk" event={"ID":"6f6c12c3-4870-43b4-b65f-988c9d9e9c86","Type":"ContainerStarted","Data":"574f015ff4379fdd398c4396259c36a03dc637690f16c190eefdc283898e12b4"}
Apr 20 14:29:39.928287 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:39.928249 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccdb78fc-wlsht" event={"ID":"2ec71445-b718-44da-b784-516ee20294b6","Type":"ContainerStarted","Data":"5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700"}
Apr 20 14:29:39.929910 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:39.929879 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bcf8fcc6-f6lsk" event={"ID":"6f6c12c3-4870-43b4-b65f-988c9d9e9c86","Type":"ContainerStarted","Data":"2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591"}
Apr 20 14:29:39.947508 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:39.947466 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bccdb78fc-wlsht" podStartSLOduration=7.89082437 podStartE2EDuration="11.94745467s" podCreationTimestamp="2026-04-20 14:29:28 +0000 UTC" firstStartedPulling="2026-04-20 14:29:35.025123828 +0000 UTC m=+155.204612288" lastFinishedPulling="2026-04-20 14:29:39.081754126 +0000 UTC m=+159.261242588" observedRunningTime="2026-04-20 14:29:39.946438205 +0000 UTC m=+160.125926673" watchObservedRunningTime="2026-04-20 14:29:39.94745467 +0000 UTC m=+160.126943139"
Apr 20 14:29:39.964808 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:39.964766 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64bcf8fcc6-f6lsk" podStartSLOduration=2.372278053 podStartE2EDuration="2.964751431s" podCreationTimestamp="2026-04-20 14:29:37 +0000 UTC" firstStartedPulling="2026-04-20 14:29:38.80906419 +0000 UTC m=+158.988552635" lastFinishedPulling="2026-04-20 14:29:39.401537559 +0000 UTC m=+159.581026013" observedRunningTime="2026-04-20 14:29:39.962900169 +0000 UTC m=+160.142388638" watchObservedRunningTime="2026-04-20 14:29:39.964751431 +0000 UTC m=+160.144239898"
Apr 20 14:29:42.940610 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:42.940570 2581 generic.go:358] "Generic (PLEG): container finished" podID="a67564e3-e8db-4d6a-a8a4-591a0e2cf642" containerID="5a6ab794f637d54e1056a3932997e897038ea698ae3ee36e23792ba19b4ee5f7" exitCode=0
Apr 20 14:29:42.941262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:42.940620 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" event={"ID":"a67564e3-e8db-4d6a-a8a4-591a0e2cf642","Type":"ContainerDied","Data":"5a6ab794f637d54e1056a3932997e897038ea698ae3ee36e23792ba19b4ee5f7"}
Apr 20 14:29:42.941262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:42.941056 2581 scope.go:117] "RemoveContainer" containerID="5a6ab794f637d54e1056a3932997e897038ea698ae3ee36e23792ba19b4ee5f7"
Apr 20 14:29:43.759979 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:43.759948 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5qgnp_ce3a0704-031d-43e8-87ac-e53039d6f376/dns/0.log"
Apr 20 14:29:43.945369 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:43.945328 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6hcxh" event={"ID":"a67564e3-e8db-4d6a-a8a4-591a0e2cf642","Type":"ContainerStarted","Data":"c7f5e64031e8eca899670e41cd0aaeead78c10f1cd67b2207e82c41e5547b56c"}
Apr 20 14:29:43.958076 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:43.958052 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5qgnp_ce3a0704-031d-43e8-87ac-e53039d6f376/kube-rbac-proxy/0.log"
Apr 20 14:29:44.958658 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:44.958630 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lmkks_4828e4f0-6105-42f8-8ec4-54d66f7d101d/dns-node-resolver/0.log"
Apr 20 14:29:45.561709 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:45.561685 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7748fc9578-6ldxb_9e88f589-217b-4f8f-a0a4-7289dd42caff/router/0.log"
Apr 20 14:29:46.159700 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:46.159673 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-htgwf_e5f2a8a7-39d4-41d8-9ca2-1a049023a466/serve-healthcheck-canary/0.log"
Apr 20 14:29:47.677835 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:47.677804 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:29:47.678213 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:47.677908 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:29:47.682227 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:47.682204 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:29:47.965115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:47.965096 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:29:48.014463 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:48.014437 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bccdb78fc-wlsht"]
Apr 20 14:29:48.498715 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:29:48.498688 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-bccdb78fc-wlsht"
Apr 20 14:30:13.034217 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.034159 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bccdb78fc-wlsht" podUID="2ec71445-b718-44da-b784-516ee20294b6" containerName="console" containerID="cri-o://5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700" gracePeriod=15
Apr 20 14:30:13.312226 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.312199 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bccdb78fc-wlsht_2ec71445-b718-44da-b784-516ee20294b6/console/0.log"
Apr 20 14:30:13.312350 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.312284 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bccdb78fc-wlsht"
Apr 20 14:30:13.478546 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478517 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mchqp\" (UniqueName: \"kubernetes.io/projected/2ec71445-b718-44da-b784-516ee20294b6-kube-api-access-mchqp\") pod \"2ec71445-b718-44da-b784-516ee20294b6\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") "
Apr 20 14:30:13.478695 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478551 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-service-ca\") pod \"2ec71445-b718-44da-b784-516ee20294b6\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") "
Apr 20 14:30:13.478695 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478571 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-serving-cert\") pod \"2ec71445-b718-44da-b784-516ee20294b6\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") "
Apr 20 14:30:13.478695 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478592 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-oauth-config\") pod \"2ec71445-b718-44da-b784-516ee20294b6\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") "
Apr 20 14:30:13.478695 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478647 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-console-config\") pod \"2ec71445-b718-44da-b784-516ee20294b6\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") "
Apr 20 14:30:13.478938 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478708 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-oauth-serving-cert\") pod \"2ec71445-b718-44da-b784-516ee20294b6\" (UID: \"2ec71445-b718-44da-b784-516ee20294b6\") "
Apr 20 14:30:13.478994 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.478968 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-service-ca" (OuterVolumeSpecName: "service-ca") pod "2ec71445-b718-44da-b784-516ee20294b6" (UID: "2ec71445-b718-44da-b784-516ee20294b6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:30:13.479187 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.479163 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-console-config" (OuterVolumeSpecName: "console-config") pod "2ec71445-b718-44da-b784-516ee20294b6" (UID: "2ec71445-b718-44da-b784-516ee20294b6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:30:13.479258 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.479199 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2ec71445-b718-44da-b784-516ee20294b6" (UID: "2ec71445-b718-44da-b784-516ee20294b6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:30:13.481064 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.481038 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2ec71445-b718-44da-b784-516ee20294b6" (UID: "2ec71445-b718-44da-b784-516ee20294b6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:30:13.481265 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.481245 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2ec71445-b718-44da-b784-516ee20294b6" (UID: "2ec71445-b718-44da-b784-516ee20294b6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:30:13.481265 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.481248 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec71445-b718-44da-b784-516ee20294b6-kube-api-access-mchqp" (OuterVolumeSpecName: "kube-api-access-mchqp") pod "2ec71445-b718-44da-b784-516ee20294b6" (UID: "2ec71445-b718-44da-b784-516ee20294b6"). InnerVolumeSpecName "kube-api-access-mchqp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:30:13.579704 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.579649 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-oauth-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:30:13.579704 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.579673 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-console-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:30:13.579704 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.579686 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-oauth-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:30:13.579704 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.579700 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mchqp\" (UniqueName: \"kubernetes.io/projected/2ec71445-b718-44da-b784-516ee20294b6-kube-api-access-mchqp\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:30:13.579900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.579714 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2ec71445-b718-44da-b784-516ee20294b6-service-ca\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:30:13.579900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:13.579751 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec71445-b718-44da-b784-516ee20294b6-console-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:30:14.033751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.033700 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bccdb78fc-wlsht_2ec71445-b718-44da-b784-516ee20294b6/console/0.log"
Apr 20 14:30:14.033913 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.033757 2581 generic.go:358] "Generic (PLEG): container finished" podID="2ec71445-b718-44da-b784-516ee20294b6" containerID="5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700" exitCode=2
Apr 20 14:30:14.033913 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.033788 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccdb78fc-wlsht" event={"ID":"2ec71445-b718-44da-b784-516ee20294b6","Type":"ContainerDied","Data":"5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700"}
Apr 20 14:30:14.033913 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.033826 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bccdb78fc-wlsht"
Apr 20 14:30:14.033913 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.033842 2581 scope.go:117] "RemoveContainer" containerID="5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700"
Apr 20 14:30:14.034076 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.033830 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bccdb78fc-wlsht" event={"ID":"2ec71445-b718-44da-b784-516ee20294b6","Type":"ContainerDied","Data":"b3a010a11c9411b5583998949c782b309cf46fe16d1f3920f5098acc70719a15"}
Apr 20 14:30:14.042297 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.042141 2581 scope.go:117] "RemoveContainer" containerID="5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700"
Apr 20 14:30:14.042513 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:30:14.042345 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700\": container with ID starting with 5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700 not found: ID does not exist" containerID="5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700"
Apr 20 14:30:14.042513 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.042369 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700"} err="failed to get container status \"5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700\": rpc error: code = NotFound desc = could not find container \"5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700\": container with ID starting with 5d707954e00a528ac65c0154f0150818a5b0284e67ae82fe9e684b0a5c922700 not found: ID does not exist"
Apr 20 14:30:14.054376 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.054354 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bccdb78fc-wlsht"]
Apr 20 14:30:14.059889 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.059870 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bccdb78fc-wlsht"]
Apr 20 14:30:14.386381 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:14.386295 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec71445-b718-44da-b784-516ee20294b6" path="/var/lib/kubelet/pods/2ec71445-b718-44da-b784-516ee20294b6/volumes"
Apr 20 14:30:33.533342 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.533312 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-764566f897-jfm8q"]
Apr 20 14:30:33.533864 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.533585 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ec71445-b718-44da-b784-516ee20294b6" containerName="console"
Apr 20 14:30:33.533864 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.533599 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec71445-b718-44da-b784-516ee20294b6" containerName="console"
Apr 20 14:30:33.533864 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.533638 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ec71445-b718-44da-b784-516ee20294b6" containerName="console"
Apr 20 14:30:33.538285 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.538269 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.552597 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.552570 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764566f897-jfm8q"]
Apr 20 14:30:33.623486 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623465 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-config\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.623596 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623498 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-trusted-ca-bundle\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.623596 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623519 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-oauth-serving-cert\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.623596 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623579 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-service-ca\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.623751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-serving-cert\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.623751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623647 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-oauth-config\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.623751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.623676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54c48\" (UniqueName: \"kubernetes.io/projected/76a55b08-d062-4b14-8e68-dbcffa64f3c9-kube-api-access-54c48\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.723951 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.723926 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-service-ca\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.723956 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-serving-cert\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.723979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-oauth-config\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.723994 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54c48\" (UniqueName: \"kubernetes.io/projected/76a55b08-d062-4b14-8e68-dbcffa64f3c9-kube-api-access-54c48\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.724021 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-config\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.724041 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-trusted-ca-bundle\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724291 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.724168 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-oauth-serving-cert\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724748 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.724701 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-config\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.724854 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.724839 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-service-ca\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.725066 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.725043 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-oauth-serving-cert\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.725106 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.725050 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-trusted-ca-bundle\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.726650 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.726621 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-serving-cert\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.726767 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.726672 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-oauth-config\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.734778 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.734759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54c48\" (UniqueName: \"kubernetes.io/projected/76a55b08-d062-4b14-8e68-dbcffa64f3c9-kube-api-access-54c48\") pod \"console-764566f897-jfm8q\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") " pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.848015 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.847948 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:33.970707 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:33.970666 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-764566f897-jfm8q"]
Apr 20 14:30:33.973545 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:30:33.973520 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76a55b08_d062_4b14_8e68_dbcffa64f3c9.slice/crio-e2467029c5a1783329ef9733a5799427c6b3c1d7b53e823639ce1eef68cf347b WatchSource:0}: Error finding container e2467029c5a1783329ef9733a5799427c6b3c1d7b53e823639ce1eef68cf347b: Status 404 returned error can't find the container with id e2467029c5a1783329ef9733a5799427c6b3c1d7b53e823639ce1eef68cf347b
Apr 20 14:30:34.088707 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:34.088668 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764566f897-jfm8q" event={"ID":"76a55b08-d062-4b14-8e68-dbcffa64f3c9","Type":"ContainerStarted","Data":"2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb"}
Apr 20 14:30:34.088886 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:34.088717 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764566f897-jfm8q" event={"ID":"76a55b08-d062-4b14-8e68-dbcffa64f3c9","Type":"ContainerStarted","Data":"e2467029c5a1783329ef9733a5799427c6b3c1d7b53e823639ce1eef68cf347b"}
Apr 20 14:30:34.110114 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:34.110026 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-764566f897-jfm8q" podStartSLOduration=1.110012493 podStartE2EDuration="1.110012493s" podCreationTimestamp="2026-04-20 14:30:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:30:34.108456316 +0000 UTC m=+214.287944784" watchObservedRunningTime="2026-04-20 14:30:34.110012493 +0000 UTC m=+214.289500961"
Apr 20 14:30:43.848196 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:43.848162 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:43.848668 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:43.848248 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:43.852562 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:43.852543 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:44.120514 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:44.120434 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:30:44.171184 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:30:44.171149 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64bcf8fcc6-f6lsk"]
Apr 20 14:31:09.193788 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.193642 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64bcf8fcc6-f6lsk" podUID="6f6c12c3-4870-43b4-b65f-988c9d9e9c86" containerName="console" containerID="cri-o://2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591" gracePeriod=15
Apr 20 14:31:09.428986 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.428964 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64bcf8fcc6-f6lsk_6f6c12c3-4870-43b4-b65f-988c9d9e9c86/console/0.log"
Apr 20 14:31:09.429098 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.429023 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64bcf8fcc6-f6lsk"
Apr 20 14:31:09.473142 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473113 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-trusted-ca-bundle\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473178 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-oauth-serving-cert\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473217 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snhqw\" (UniqueName: \"kubernetes.io/projected/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-kube-api-access-snhqw\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473238 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-config\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473259 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-service-ca\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473276 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-oauth-config\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473550 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473300 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-serving-cert\") pod \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\" (UID: \"6f6c12c3-4870-43b4-b65f-988c9d9e9c86\") "
Apr 20 14:31:09.473652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473620 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:31:09.473652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473636 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:31:09.473791 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473706 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-config" (OuterVolumeSpecName: "console-config") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:31:09.473791 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.473741 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-service-ca" (OuterVolumeSpecName: "service-ca") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:31:09.475613 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.475589 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-kube-api-access-snhqw" (OuterVolumeSpecName: "kube-api-access-snhqw") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "kube-api-access-snhqw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:31:09.475809 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.475791 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:31:09.475877 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.475862 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6f6c12c3-4870-43b4-b65f-988c9d9e9c86" (UID: "6f6c12c3-4870-43b4-b65f-988c9d9e9c86"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:31:09.574847 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574817 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snhqw\" (UniqueName: \"kubernetes.io/projected/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-kube-api-access-snhqw\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:31:09.574847 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574842 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:31:09.574847 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574854 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-service-ca\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:31:09.575039 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574862 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-oauth-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:31:09.575039 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574871 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-console-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:31:09.575039 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574880 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-trusted-ca-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:31:09.575039 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:09.574888 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6f6c12c3-4870-43b4-b65f-988c9d9e9c86-oauth-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:31:10.187459 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.187433 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64bcf8fcc6-f6lsk_6f6c12c3-4870-43b4-b65f-988c9d9e9c86/console/0.log" Apr 20 14:31:10.187652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.187474 2581 generic.go:358] "Generic (PLEG): container finished" podID="6f6c12c3-4870-43b4-b65f-988c9d9e9c86" containerID="2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591" exitCode=2 Apr 20 14:31:10.187652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.187545 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64bcf8fcc6-f6lsk" Apr 20 14:31:10.187652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.187578 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bcf8fcc6-f6lsk" event={"ID":"6f6c12c3-4870-43b4-b65f-988c9d9e9c86","Type":"ContainerDied","Data":"2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591"} Apr 20 14:31:10.187652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.187628 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64bcf8fcc6-f6lsk" event={"ID":"6f6c12c3-4870-43b4-b65f-988c9d9e9c86","Type":"ContainerDied","Data":"574f015ff4379fdd398c4396259c36a03dc637690f16c190eefdc283898e12b4"} Apr 20 14:31:10.187652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.187649 2581 scope.go:117] "RemoveContainer" containerID="2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591" Apr 20 14:31:10.196915 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.196736 2581 scope.go:117] "RemoveContainer" containerID="2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591" Apr 20 14:31:10.197126 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:31:10.196981 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591\": container with ID starting with 2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591 not found: ID does not exist" containerID="2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591" Apr 20 14:31:10.197126 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.197005 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591"} err="failed to get container status \"2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591\": rpc error: code = 
NotFound desc = could not find container \"2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591\": container with ID starting with 2e385301e8ff0e4fb00b1af25e9c2818216861fc69a759d370bd50cbdf09a591 not found: ID does not exist" Apr 20 14:31:10.207819 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.207797 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64bcf8fcc6-f6lsk"] Apr 20 14:31:10.211826 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.211806 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64bcf8fcc6-f6lsk"] Apr 20 14:31:10.386131 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:10.386103 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6c12c3-4870-43b4-b65f-988c9d9e9c86" path="/var/lib/kubelet/pods/6f6c12c3-4870-43b4-b65f-988c9d9e9c86/volumes" Apr 20 14:31:56.344300 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.344270 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84df7bcd6c-5t7v8"] Apr 20 14:31:56.344878 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.344768 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f6c12c3-4870-43b4-b65f-988c9d9e9c86" containerName="console" Apr 20 14:31:56.344878 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.344791 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6c12c3-4870-43b4-b65f-988c9d9e9c86" containerName="console" Apr 20 14:31:56.345004 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.344883 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f6c12c3-4870-43b4-b65f-988c9d9e9c86" containerName="console" Apr 20 14:31:56.347788 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.347767 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.357088 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.357064 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84df7bcd6c-5t7v8"] Apr 20 14:31:56.489223 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489185 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-trusted-ca-bundle\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.489388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489247 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdmc\" (UniqueName: \"kubernetes.io/projected/6853c866-17da-4ff6-99a0-2bdc83042e36-kube-api-access-pvdmc\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.489388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489324 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-oauth-config\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.489388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489364 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-serving-cert\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " 
pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.489388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489385 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-service-ca\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.489545 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-console-config\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.489545 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.489445 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-oauth-serving-cert\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.590740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdmc\" (UniqueName: \"kubernetes.io/projected/6853c866-17da-4ff6-99a0-2bdc83042e36-kube-api-access-pvdmc\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.590900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590768 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-oauth-config\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.590900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590794 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-serving-cert\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.590900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590815 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-service-ca\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.590900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590830 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-console-config\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.590900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590848 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-oauth-serving-cert\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.591169 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.590900 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-trusted-ca-bundle\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.591632 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.591606 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-console-config\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.591754 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.591607 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-service-ca\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.591754 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.591647 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-oauth-serving-cert\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.592053 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.592029 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-trusted-ca-bundle\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.593125 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.593100 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-serving-cert\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.593280 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.593263 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-oauth-config\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.600127 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.600078 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdmc\" (UniqueName: \"kubernetes.io/projected/6853c866-17da-4ff6-99a0-2bdc83042e36-kube-api-access-pvdmc\") pod \"console-84df7bcd6c-5t7v8\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.657560 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.657530 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:31:56.781603 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:56.781579 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84df7bcd6c-5t7v8"] Apr 20 14:31:56.784230 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:31:56.784199 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6853c866_17da_4ff6_99a0_2bdc83042e36.slice/crio-bf63a60d4ef515571073de2227cf8ea75823c4bab551547000bf8def9497762b WatchSource:0}: Error finding container bf63a60d4ef515571073de2227cf8ea75823c4bab551547000bf8def9497762b: Status 404 returned error can't find the container with id bf63a60d4ef515571073de2227cf8ea75823c4bab551547000bf8def9497762b Apr 20 14:31:57.310421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:57.310380 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84df7bcd6c-5t7v8" event={"ID":"6853c866-17da-4ff6-99a0-2bdc83042e36","Type":"ContainerStarted","Data":"0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2"} Apr 20 14:31:57.310421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:57.310425 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84df7bcd6c-5t7v8" event={"ID":"6853c866-17da-4ff6-99a0-2bdc83042e36","Type":"ContainerStarted","Data":"bf63a60d4ef515571073de2227cf8ea75823c4bab551547000bf8def9497762b"} Apr 20 14:31:57.329132 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:31:57.329080 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84df7bcd6c-5t7v8" podStartSLOduration=1.329065534 podStartE2EDuration="1.329065534s" podCreationTimestamp="2026-04-20 14:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:31:57.327768785 +0000 UTC 
m=+297.507257256" watchObservedRunningTime="2026-04-20 14:31:57.329065534 +0000 UTC m=+297.508554004" Apr 20 14:32:00.256082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:00.256051 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:32:00.257706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:00.257680 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:32:00.262031 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:00.262011 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:32:00.263267 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:00.263249 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:32:00.265362 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:00.265346 2581 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 14:32:06.658421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:06.658390 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:32:06.658421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:06.658423 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:32:06.663176 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:06.663160 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:32:07.340268 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:07.340240 2581 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:32:07.388078 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:07.388047 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764566f897-jfm8q"] Apr 20 14:32:19.643578 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.643546 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-4tqnb"] Apr 20 14:32:19.646583 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.646563 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.651317 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.651295 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:32:19.659347 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.659324 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4tqnb"] Apr 20 14:32:19.758056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.758025 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-original-pull-secret\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.758173 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.758079 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-dbus\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.758220 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:32:19.758165 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-kubelet-config\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.858888 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.858860 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-original-pull-secret\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.859001 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.858911 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-dbus\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.859001 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.858961 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-kubelet-config\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.859118 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.859060 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-kubelet-config\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " 
pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.859118 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.859109 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-dbus\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.861284 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.861265 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a0fe6fd9-cb83-4fb9-94b4-1115d7285c71-original-pull-secret\") pod \"global-pull-secret-syncer-4tqnb\" (UID: \"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71\") " pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:19.955399 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:19.955329 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-4tqnb" Apr 20 14:32:20.069118 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:20.069097 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-4tqnb"] Apr 20 14:32:20.071565 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:32:20.071533 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0fe6fd9_cb83_4fb9_94b4_1115d7285c71.slice/crio-284bd388a7ab4726d054f4eaa2df730799c66f3a1a6b382b3ba475cbc81a8eed WatchSource:0}: Error finding container 284bd388a7ab4726d054f4eaa2df730799c66f3a1a6b382b3ba475cbc81a8eed: Status 404 returned error can't find the container with id 284bd388a7ab4726d054f4eaa2df730799c66f3a1a6b382b3ba475cbc81a8eed Apr 20 14:32:20.073123 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:20.073103 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 
14:32:20.369629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:20.369589 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4tqnb" event={"ID":"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71","Type":"ContainerStarted","Data":"284bd388a7ab4726d054f4eaa2df730799c66f3a1a6b382b3ba475cbc81a8eed"}
Apr 20 14:32:26.388532 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:26.388497 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-4tqnb" event={"ID":"a0fe6fd9-cb83-4fb9-94b4-1115d7285c71","Type":"ContainerStarted","Data":"4607a61aba16467b6013dcc6fcc6ac96c1049fc5a6df05b4ced98ed2b0a28f86"}
Apr 20 14:32:26.405084 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:26.405028 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-4tqnb" podStartSLOduration=1.933564469 podStartE2EDuration="7.405008663s" podCreationTimestamp="2026-04-20 14:32:19 +0000 UTC" firstStartedPulling="2026-04-20 14:32:20.073229941 +0000 UTC m=+320.252718388" lastFinishedPulling="2026-04-20 14:32:25.544674136 +0000 UTC m=+325.724162582" observedRunningTime="2026-04-20 14:32:26.404335062 +0000 UTC m=+326.583823531" watchObservedRunningTime="2026-04-20 14:32:26.405008663 +0000 UTC m=+326.584497132"
Apr 20 14:32:32.413087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.413029 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-764566f897-jfm8q" podUID="76a55b08-d062-4b14-8e68-dbcffa64f3c9" containerName="console" containerID="cri-o://2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb" gracePeriod=15
Apr 20 14:32:32.645327 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.645306 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764566f897-jfm8q_76a55b08-d062-4b14-8e68-dbcffa64f3c9/console/0.log"
Apr 20 14:32:32.645427 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.645377 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:32:32.752452 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752423 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-serving-cert\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.752452 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752453 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-oauth-config\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.752647 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752477 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-trusted-ca-bundle\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.752647 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752556 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-oauth-serving-cert\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.752647 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752582 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-service-ca\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.752647 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752612 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54c48\" (UniqueName: \"kubernetes.io/projected/76a55b08-d062-4b14-8e68-dbcffa64f3c9-kube-api-access-54c48\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.752909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752649 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-config\") pod \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\" (UID: \"76a55b08-d062-4b14-8e68-dbcffa64f3c9\") "
Apr 20 14:32:32.753054 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.752997 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:32:32.753054 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.753007 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-service-ca" (OuterVolumeSpecName: "service-ca") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:32:32.753054 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.753013 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:32:32.753054 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.753043 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-config" (OuterVolumeSpecName: "console-config") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:32:32.754951 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.754926 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:32:32.755043 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.755001 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:32:32.755158 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.755138 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a55b08-d062-4b14-8e68-dbcffa64f3c9-kube-api-access-54c48" (OuterVolumeSpecName: "kube-api-access-54c48") pod "76a55b08-d062-4b14-8e68-dbcffa64f3c9" (UID: "76a55b08-d062-4b14-8e68-dbcffa64f3c9"). InnerVolumeSpecName "kube-api-access-54c48". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:32:32.853435 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853399 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-trusted-ca-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:32.853435 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853432 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-oauth-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:32.853435 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853442 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-service-ca\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:32.853593 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853451 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54c48\" (UniqueName: \"kubernetes.io/projected/76a55b08-d062-4b14-8e68-dbcffa64f3c9-kube-api-access-54c48\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:32.853593 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853460 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:32.853593 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853469 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:32.853593 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:32.853477 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76a55b08-d062-4b14-8e68-dbcffa64f3c9-console-oauth-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:32:33.409315 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.409285 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-764566f897-jfm8q_76a55b08-d062-4b14-8e68-dbcffa64f3c9/console/0.log"
Apr 20 14:32:33.409480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.409324 2581 generic.go:358] "Generic (PLEG): container finished" podID="76a55b08-d062-4b14-8e68-dbcffa64f3c9" containerID="2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb" exitCode=2
Apr 20 14:32:33.409480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.409380 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-764566f897-jfm8q"
Apr 20 14:32:33.409480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.409413 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764566f897-jfm8q" event={"ID":"76a55b08-d062-4b14-8e68-dbcffa64f3c9","Type":"ContainerDied","Data":"2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb"}
Apr 20 14:32:33.409480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.409449 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-764566f897-jfm8q" event={"ID":"76a55b08-d062-4b14-8e68-dbcffa64f3c9","Type":"ContainerDied","Data":"e2467029c5a1783329ef9733a5799427c6b3c1d7b53e823639ce1eef68cf347b"}
Apr 20 14:32:33.409480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.409465 2581 scope.go:117] "RemoveContainer" containerID="2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb"
Apr 20 14:32:33.418636 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.418440 2581 scope.go:117] "RemoveContainer" containerID="2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb"
Apr 20 14:32:33.418928 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:32:33.418715 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb\": container with ID starting with 2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb not found: ID does not exist" containerID="2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb"
Apr 20 14:32:33.418928 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.418761 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb"} err="failed to get container status \"2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb\": rpc error: code = NotFound desc = could not find container \"2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb\": container with ID starting with 2d19087872ff33ed1c484c8f50f83479d43eb2855b0cc8c856c1729c0d297cfb not found: ID does not exist"
Apr 20 14:32:33.432969 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.432947 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-764566f897-jfm8q"]
Apr 20 14:32:33.436300 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:33.436273 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-764566f897-jfm8q"]
Apr 20 14:32:34.387169 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:32:34.387133 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76a55b08-d062-4b14-8e68-dbcffa64f3c9" path="/var/lib/kubelet/pods/76a55b08-d062-4b14-8e68-dbcffa64f3c9/volumes"
Apr 20 14:33:53.670808 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.670766 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"]
Apr 20 14:33:53.671436 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.671079 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76a55b08-d062-4b14-8e68-dbcffa64f3c9" containerName="console"
Apr 20 14:33:53.671436 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.671093 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a55b08-d062-4b14-8e68-dbcffa64f3c9" containerName="console"
Apr 20 14:33:53.671436 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.671150 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="76a55b08-d062-4b14-8e68-dbcffa64f3c9" containerName="console"
Apr 20 14:33:53.674215 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.674197 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.679138 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.679113 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 14:33:53.680110 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.680088 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 14:33:53.680110 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.680100 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-75g9b\""
Apr 20 14:33:53.687087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.687063 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"]
Apr 20 14:33:53.811973 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.811943 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfr2t\" (UniqueName: \"kubernetes.io/projected/a72d0aed-d780-4d40-a64d-48fcda57e564-kube-api-access-xfr2t\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.812097 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.811983 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.812097 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.812020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.913217 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.913188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfr2t\" (UniqueName: \"kubernetes.io/projected/a72d0aed-d780-4d40-a64d-48fcda57e564-kube-api-access-xfr2t\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.913314 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.913237 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.913314 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.913270 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.913595 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.913576 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.913635 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.913598 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.921492 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.921435 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfr2t\" (UniqueName: \"kubernetes.io/projected/a72d0aed-d780-4d40-a64d-48fcda57e564-kube-api-access-xfr2t\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:53.983960 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:53.983931 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:33:54.105541 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:54.105518 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"]
Apr 20 14:33:54.108224 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:33:54.108194 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda72d0aed_d780_4d40_a64d_48fcda57e564.slice/crio-13136053ab31c81c4cf65a7178d1e11873df3e9448326a1bb0e1d4c334fd7805 WatchSource:0}: Error finding container 13136053ab31c81c4cf65a7178d1e11873df3e9448326a1bb0e1d4c334fd7805: Status 404 returned error can't find the container with id 13136053ab31c81c4cf65a7178d1e11873df3e9448326a1bb0e1d4c334fd7805
Apr 20 14:33:54.629959 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:54.629919 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5" event={"ID":"a72d0aed-d780-4d40-a64d-48fcda57e564","Type":"ContainerStarted","Data":"13136053ab31c81c4cf65a7178d1e11873df3e9448326a1bb0e1d4c334fd7805"}
Apr 20 14:33:59.644779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:59.644719 2581 generic.go:358] "Generic (PLEG): container finished" podID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerID="98baa16e8d9144425502dd8182ed7486504536e3488832fb5cc0a2306c6eac3a" exitCode=0
Apr 20 14:33:59.645169 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:33:59.644785 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5" event={"ID":"a72d0aed-d780-4d40-a64d-48fcda57e564","Type":"ContainerDied","Data":"98baa16e8d9144425502dd8182ed7486504536e3488832fb5cc0a2306c6eac3a"}
Apr 20 14:34:02.654760 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:02.654696 2581 generic.go:358] "Generic (PLEG): container finished" podID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerID="d0cc31a4d9eee6e2af8a67aff1b95e7982defe31f6136d9ed525dad1464b74e8" exitCode=0
Apr 20 14:34:02.655204 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:02.654775 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5" event={"ID":"a72d0aed-d780-4d40-a64d-48fcda57e564","Type":"ContainerDied","Data":"d0cc31a4d9eee6e2af8a67aff1b95e7982defe31f6136d9ed525dad1464b74e8"}
Apr 20 14:34:09.676148 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:09.676112 2581 generic.go:358] "Generic (PLEG): container finished" podID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerID="67dfe51c7fa4b7722e724b43d54e9e598249ac1bcd012938ed68264c407f127b" exitCode=0
Apr 20 14:34:09.676507 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:09.676166 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5" event={"ID":"a72d0aed-d780-4d40-a64d-48fcda57e564","Type":"ContainerDied","Data":"67dfe51c7fa4b7722e724b43d54e9e598249ac1bcd012938ed68264c407f127b"}
Apr 20 14:34:10.798584 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.798563 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:34:10.851037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.851009 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-bundle\") pod \"a72d0aed-d780-4d40-a64d-48fcda57e564\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") "
Apr 20 14:34:10.851158 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.851077 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfr2t\" (UniqueName: \"kubernetes.io/projected/a72d0aed-d780-4d40-a64d-48fcda57e564-kube-api-access-xfr2t\") pod \"a72d0aed-d780-4d40-a64d-48fcda57e564\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") "
Apr 20 14:34:10.851158 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.851105 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-util\") pod \"a72d0aed-d780-4d40-a64d-48fcda57e564\" (UID: \"a72d0aed-d780-4d40-a64d-48fcda57e564\") "
Apr 20 14:34:10.851568 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.851544 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-bundle" (OuterVolumeSpecName: "bundle") pod "a72d0aed-d780-4d40-a64d-48fcda57e564" (UID: "a72d0aed-d780-4d40-a64d-48fcda57e564"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:34:10.853397 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.853372 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72d0aed-d780-4d40-a64d-48fcda57e564-kube-api-access-xfr2t" (OuterVolumeSpecName: "kube-api-access-xfr2t") pod "a72d0aed-d780-4d40-a64d-48fcda57e564" (UID: "a72d0aed-d780-4d40-a64d-48fcda57e564"). InnerVolumeSpecName "kube-api-access-xfr2t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:34:10.854988 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.854970 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-util" (OuterVolumeSpecName: "util") pod "a72d0aed-d780-4d40-a64d-48fcda57e564" (UID: "a72d0aed-d780-4d40-a64d-48fcda57e564"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:34:10.951954 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.951869 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfr2t\" (UniqueName: \"kubernetes.io/projected/a72d0aed-d780-4d40-a64d-48fcda57e564-kube-api-access-xfr2t\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:34:10.951954 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.951902 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:34:10.951954 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:10.951914 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a72d0aed-d780-4d40-a64d-48fcda57e564-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:34:11.683413 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:11.683374 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5" event={"ID":"a72d0aed-d780-4d40-a64d-48fcda57e564","Type":"ContainerDied","Data":"13136053ab31c81c4cf65a7178d1e11873df3e9448326a1bb0e1d4c334fd7805"}
Apr 20 14:34:11.683413 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:11.683410 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13136053ab31c81c4cf65a7178d1e11873df3e9448326a1bb0e1d4c334fd7805"
Apr 20 14:34:11.683618 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:11.683436 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dcjc5"
Apr 20 14:34:20.758177 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758147 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"]
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758387 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="extract"
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758396 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="extract"
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758410 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="util"
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758415 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="util"
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758426 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="pull"
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758431 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="pull"
Apr 20 14:34:20.758629 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.758474 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a72d0aed-d780-4d40-a64d-48fcda57e564" containerName="extract"
Apr 20 14:34:20.764872 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.764856 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.767391 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.767358 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 14:34:20.768316 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.768293 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 14:34:20.768433 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.768316 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-75g9b\""
Apr 20 14:34:20.769932 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.769907 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"]
Apr 20 14:34:20.816766 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.816741 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.816766 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.816767 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2r6\" (UniqueName: \"kubernetes.io/projected/eee7cf26-2d64-4900-91a3-15977f726cc9-kube-api-access-kp2r6\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.816919 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.816830 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.917777 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.917744 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.917930 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.917795 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.917930 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.917813 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2r6\" (UniqueName: \"kubernetes.io/projected/eee7cf26-2d64-4900-91a3-15977f726cc9-kube-api-access-kp2r6\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.918094 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.918075 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.918159 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.918143 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:20.928323 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:20.928299 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2r6\" (UniqueName: \"kubernetes.io/projected/eee7cf26-2d64-4900-91a3-15977f726cc9-kube-api-access-kp2r6\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:21.074794 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:21.074713 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"
Apr 20 14:34:21.198539 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:21.198513 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql"]
Apr 20 14:34:21.200976 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:34:21.200953 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee7cf26_2d64_4900_91a3_15977f726cc9.slice/crio-340f263c392ad49d91860e5b7f4c5baa7c9aaf8f34bd9a7e0da629217e22093f WatchSource:0}: Error finding container 340f263c392ad49d91860e5b7f4c5baa7c9aaf8f34bd9a7e0da629217e22093f: Status 404 returned error can't find the container with id 340f263c392ad49d91860e5b7f4c5baa7c9aaf8f34bd9a7e0da629217e22093f
Apr 20 14:34:21.711052 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:21.711011 2581 generic.go:358] "Generic (PLEG): container finished" podID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerID="e92d62df391e10235879f4e49d363035e3a397294cfd5a83a14bd3c2985b39a7" exitCode=0
Apr 20 14:34:21.711253 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:21.711049 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" event={"ID":"eee7cf26-2d64-4900-91a3-15977f726cc9","Type":"ContainerDied","Data":"e92d62df391e10235879f4e49d363035e3a397294cfd5a83a14bd3c2985b39a7"}
Apr 20 14:34:21.711253 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:21.711091 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" event={"ID":"eee7cf26-2d64-4900-91a3-15977f726cc9","Type":"ContainerStarted","Data":"340f263c392ad49d91860e5b7f4c5baa7c9aaf8f34bd9a7e0da629217e22093f"}
Apr 20 14:34:23.718971 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:23.718942 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" event={"ID":"eee7cf26-2d64-4900-91a3-15977f726cc9","Type":"ContainerStarted","Data":"cd8e284d9f3639be5110d688851356e5e89c64e0630bb06e1651bf8fe4d8f3b7"}
Apr 20 14:34:24.725221 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:24.725184 2581 generic.go:358] "Generic (PLEG): container finished" podID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerID="cd8e284d9f3639be5110d688851356e5e89c64e0630bb06e1651bf8fe4d8f3b7" exitCode=0
Apr 20 14:34:24.725689 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:24.725231 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" event={"ID":"eee7cf26-2d64-4900-91a3-15977f726cc9","Type":"ContainerDied","Data":"cd8e284d9f3639be5110d688851356e5e89c64e0630bb06e1651bf8fe4d8f3b7"}
Apr 20 14:34:25.729887 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.729856 2581 generic.go:358] "Generic (PLEG): container finished" podID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerID="1abe62dd9e3ac17ccb8a8e8e1f50b8ed5e55163ab8683397d68690e3d824fac3" exitCode=0
Apr 20 14:34:25.730280 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.729939 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" event={"ID":"eee7cf26-2d64-4900-91a3-15977f726cc9","Type":"ContainerDied","Data":"1abe62dd9e3ac17ccb8a8e8e1f50b8ed5e55163ab8683397d68690e3d824fac3"}
Apr 20 14:34:25.952004 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.951969 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jc99p"]
Apr 20 14:34:25.955002 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.954986 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p"
Apr 20 14:34:25.957660 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.957642 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 14:34:25.958513 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.958497 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-d6gfv\""
Apr 20 14:34:25.958580 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.958497 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 14:34:25.962402 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:25.962378 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jc99p"]
Apr 20 14:34:26.059174 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.059103 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5wd\" (UniqueName: \"kubernetes.io/projected/a0376da7-349a-4ed5-a07d-8a6671b93fdd-kube-api-access-lw5wd\") pod \"cert-manager-webhook-597b96b99b-jc99p\" (UID: \"a0376da7-349a-4ed5-a07d-8a6671b93fdd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p"
Apr 20 14:34:26.059174 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.059157 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0376da7-349a-4ed5-a07d-8a6671b93fdd-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jc99p\" (UID:
\"a0376da7-349a-4ed5-a07d-8a6671b93fdd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:26.159911 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.159886 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw5wd\" (UniqueName: \"kubernetes.io/projected/a0376da7-349a-4ed5-a07d-8a6671b93fdd-kube-api-access-lw5wd\") pod \"cert-manager-webhook-597b96b99b-jc99p\" (UID: \"a0376da7-349a-4ed5-a07d-8a6671b93fdd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:26.160024 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.159935 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0376da7-349a-4ed5-a07d-8a6671b93fdd-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jc99p\" (UID: \"a0376da7-349a-4ed5-a07d-8a6671b93fdd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:26.168526 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.168506 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a0376da7-349a-4ed5-a07d-8a6671b93fdd-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jc99p\" (UID: \"a0376da7-349a-4ed5-a07d-8a6671b93fdd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:26.168661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.168639 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5wd\" (UniqueName: \"kubernetes.io/projected/a0376da7-349a-4ed5-a07d-8a6671b93fdd-kube-api-access-lw5wd\") pod \"cert-manager-webhook-597b96b99b-jc99p\" (UID: \"a0376da7-349a-4ed5-a07d-8a6671b93fdd\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:26.264956 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.264929 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:26.381233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.381205 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jc99p"] Apr 20 14:34:26.384580 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:34:26.384551 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0376da7_349a_4ed5_a07d_8a6671b93fdd.slice/crio-c38a5ea6b19648c9db4b6761b9592ee2f0bb6109b1d1a0c638559fd6b7e7263a WatchSource:0}: Error finding container c38a5ea6b19648c9db4b6761b9592ee2f0bb6109b1d1a0c638559fd6b7e7263a: Status 404 returned error can't find the container with id c38a5ea6b19648c9db4b6761b9592ee2f0bb6109b1d1a0c638559fd6b7e7263a Apr 20 14:34:26.733736 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.733701 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" event={"ID":"a0376da7-349a-4ed5-a07d-8a6671b93fdd","Type":"ContainerStarted","Data":"c38a5ea6b19648c9db4b6761b9592ee2f0bb6109b1d1a0c638559fd6b7e7263a"} Apr 20 14:34:26.838676 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.838656 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" Apr 20 14:34:26.866154 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.866133 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-bundle\") pod \"eee7cf26-2d64-4900-91a3-15977f726cc9\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " Apr 20 14:34:26.866262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.866167 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp2r6\" (UniqueName: \"kubernetes.io/projected/eee7cf26-2d64-4900-91a3-15977f726cc9-kube-api-access-kp2r6\") pod \"eee7cf26-2d64-4900-91a3-15977f726cc9\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " Apr 20 14:34:26.866262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.866214 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-util\") pod \"eee7cf26-2d64-4900-91a3-15977f726cc9\" (UID: \"eee7cf26-2d64-4900-91a3-15977f726cc9\") " Apr 20 14:34:26.866490 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.866459 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-bundle" (OuterVolumeSpecName: "bundle") pod "eee7cf26-2d64-4900-91a3-15977f726cc9" (UID: "eee7cf26-2d64-4900-91a3-15977f726cc9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:34:26.868613 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.868586 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eee7cf26-2d64-4900-91a3-15977f726cc9-kube-api-access-kp2r6" (OuterVolumeSpecName: "kube-api-access-kp2r6") pod "eee7cf26-2d64-4900-91a3-15977f726cc9" (UID: "eee7cf26-2d64-4900-91a3-15977f726cc9"). InnerVolumeSpecName "kube-api-access-kp2r6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:34:26.871332 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.871309 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-util" (OuterVolumeSpecName: "util") pod "eee7cf26-2d64-4900-91a3-15977f726cc9" (UID: "eee7cf26-2d64-4900-91a3-15977f726cc9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:34:26.967095 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.967065 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:26.967095 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.967092 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kp2r6\" (UniqueName: \"kubernetes.io/projected/eee7cf26-2d64-4900-91a3-15977f726cc9-kube-api-access-kp2r6\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:26.967242 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:26.967102 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/eee7cf26-2d64-4900-91a3-15977f726cc9-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:27.738594 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:27.738551 2581 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" event={"ID":"eee7cf26-2d64-4900-91a3-15977f726cc9","Type":"ContainerDied","Data":"340f263c392ad49d91860e5b7f4c5baa7c9aaf8f34bd9a7e0da629217e22093f"} Apr 20 14:34:27.738594 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:27.738585 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffzhql" Apr 20 14:34:27.739122 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:27.738589 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="340f263c392ad49d91860e5b7f4c5baa7c9aaf8f34bd9a7e0da629217e22093f" Apr 20 14:34:29.748059 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:29.748017 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" event={"ID":"a0376da7-349a-4ed5-a07d-8a6671b93fdd","Type":"ContainerStarted","Data":"7e02ceb032bd68d01bda91266697865b9a3da253ae69a9a7cfb04d2961143bdd"} Apr 20 14:34:29.748449 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:29.748150 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:29.765883 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:29.765656 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" podStartSLOduration=2.055445666 podStartE2EDuration="4.765641581s" podCreationTimestamp="2026-04-20 14:34:25 +0000 UTC" firstStartedPulling="2026-04-20 14:34:26.386595187 +0000 UTC m=+446.566083633" lastFinishedPulling="2026-04-20 14:34:29.096791102 +0000 UTC m=+449.276279548" observedRunningTime="2026-04-20 14:34:29.765594786 +0000 UTC m=+449.945083267" watchObservedRunningTime="2026-04-20 14:34:29.765641581 +0000 UTC m=+449.945130049" Apr 20 14:34:35.753129 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:35.753103 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jc99p" Apr 20 14:34:38.252833 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.252796 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw"] Apr 20 14:34:38.253175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253088 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="pull" Apr 20 14:34:38.253175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253099 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="pull" Apr 20 14:34:38.253175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253116 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="extract" Apr 20 14:34:38.253175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253121 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="extract" Apr 20 14:34:38.253175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253131 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="util" Apr 20 14:34:38.253175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253150 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="util" Apr 20 14:34:38.253359 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.253191 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="eee7cf26-2d64-4900-91a3-15977f726cc9" containerName="extract" Apr 20 14:34:38.274780 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.274749 
2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw"] Apr 20 14:34:38.274907 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.274830 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.277782 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.277762 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 14:34:38.277894 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.277766 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 14:34:38.278572 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.278557 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-75g9b\"" Apr 20 14:34:38.348436 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.348409 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp6lt\" (UniqueName: \"kubernetes.io/projected/b5103830-7ce7-498d-9501-3724c56ceed6-kube-api-access-qp6lt\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.348541 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.348452 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.348541 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.348472 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.449071 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.449044 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp6lt\" (UniqueName: \"kubernetes.io/projected/b5103830-7ce7-498d-9501-3724c56ceed6-kube-api-access-qp6lt\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.449230 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.449210 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.449324 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.449251 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.449585 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.449567 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.449625 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.449582 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.457552 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.457531 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp6lt\" (UniqueName: \"kubernetes.io/projected/b5103830-7ce7-498d-9501-3724c56ceed6-kube-api-access-qp6lt\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.583541 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.583493 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:38.706299 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.706265 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw"] Apr 20 14:34:38.709162 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:34:38.709125 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5103830_7ce7_498d_9501_3724c56ceed6.slice/crio-dbe52638a3a3f935be73f59a8290d055fb7593cbd2a9eead08df2bfaecd88046 WatchSource:0}: Error finding container dbe52638a3a3f935be73f59a8290d055fb7593cbd2a9eead08df2bfaecd88046: Status 404 returned error can't find the container with id dbe52638a3a3f935be73f59a8290d055fb7593cbd2a9eead08df2bfaecd88046 Apr 20 14:34:38.778560 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:38.778533 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" event={"ID":"b5103830-7ce7-498d-9501-3724c56ceed6","Type":"ContainerStarted","Data":"dbe52638a3a3f935be73f59a8290d055fb7593cbd2a9eead08df2bfaecd88046"} Apr 20 14:34:39.782650 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:39.782619 2581 generic.go:358] "Generic (PLEG): container finished" podID="b5103830-7ce7-498d-9501-3724c56ceed6" containerID="16f3d11411c174d18bad26e66a3bac02695b03e56e8812cb09d905b1226b62e1" exitCode=0 Apr 20 14:34:39.783023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:39.782669 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" event={"ID":"b5103830-7ce7-498d-9501-3724c56ceed6","Type":"ContainerDied","Data":"16f3d11411c174d18bad26e66a3bac02695b03e56e8812cb09d905b1226b62e1"} Apr 20 14:34:40.787884 ip-10-0-139-136 kubenswrapper[2581]: 
I0420 14:34:40.787848 2581 generic.go:358] "Generic (PLEG): container finished" podID="b5103830-7ce7-498d-9501-3724c56ceed6" containerID="3379a1fb5eb22e60b882c59b77d6c78f1919f24b435a01bae8f9899bf8798d1e" exitCode=0 Apr 20 14:34:40.788262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:40.787947 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" event={"ID":"b5103830-7ce7-498d-9501-3724c56ceed6","Type":"ContainerDied","Data":"3379a1fb5eb22e60b882c59b77d6c78f1919f24b435a01bae8f9899bf8798d1e"} Apr 20 14:34:41.792708 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:41.792663 2581 generic.go:358] "Generic (PLEG): container finished" podID="b5103830-7ce7-498d-9501-3724c56ceed6" containerID="0b6d47d7256ea14472a7183cce7463cf30e64ab3ca0803aaf1c544fa3230ae09" exitCode=0 Apr 20 14:34:41.793105 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:41.792716 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" event={"ID":"b5103830-7ce7-498d-9501-3724c56ceed6","Type":"ContainerDied","Data":"0b6d47d7256ea14472a7183cce7463cf30e64ab3ca0803aaf1c544fa3230ae09"} Apr 20 14:34:42.619739 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.619692 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-jwzv9"] Apr 20 14:34:42.622824 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.622808 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.625474 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.625454 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-mpf8r\"" Apr 20 14:34:42.632168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.632144 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-jwzv9"] Apr 20 14:34:42.681588 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.681563 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j94gb\" (UniqueName: \"kubernetes.io/projected/d8b40831-5e06-4022-a448-b1ab7cb75c23-kube-api-access-j94gb\") pod \"cert-manager-759f64656b-jwzv9\" (UID: \"d8b40831-5e06-4022-a448-b1ab7cb75c23\") " pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.681679 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.681607 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8b40831-5e06-4022-a448-b1ab7cb75c23-bound-sa-token\") pod \"cert-manager-759f64656b-jwzv9\" (UID: \"d8b40831-5e06-4022-a448-b1ab7cb75c23\") " pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.782789 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.782765 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j94gb\" (UniqueName: \"kubernetes.io/projected/d8b40831-5e06-4022-a448-b1ab7cb75c23-kube-api-access-j94gb\") pod \"cert-manager-759f64656b-jwzv9\" (UID: \"d8b40831-5e06-4022-a448-b1ab7cb75c23\") " pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.782889 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.782808 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d8b40831-5e06-4022-a448-b1ab7cb75c23-bound-sa-token\") pod \"cert-manager-759f64656b-jwzv9\" (UID: \"d8b40831-5e06-4022-a448-b1ab7cb75c23\") " pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.791295 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.791267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d8b40831-5e06-4022-a448-b1ab7cb75c23-bound-sa-token\") pod \"cert-manager-759f64656b-jwzv9\" (UID: \"d8b40831-5e06-4022-a448-b1ab7cb75c23\") " pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.791506 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.791487 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j94gb\" (UniqueName: \"kubernetes.io/projected/d8b40831-5e06-4022-a448-b1ab7cb75c23-kube-api-access-j94gb\") pod \"cert-manager-759f64656b-jwzv9\" (UID: \"d8b40831-5e06-4022-a448-b1ab7cb75c23\") " pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.918451 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.918433 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:42.931342 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.931312 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-jwzv9" Apr 20 14:34:42.983958 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.983931 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-util\") pod \"b5103830-7ce7-498d-9501-3724c56ceed6\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " Apr 20 14:34:42.984079 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.983996 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-bundle\") pod \"b5103830-7ce7-498d-9501-3724c56ceed6\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " Apr 20 14:34:42.984079 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.984073 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp6lt\" (UniqueName: \"kubernetes.io/projected/b5103830-7ce7-498d-9501-3724c56ceed6-kube-api-access-qp6lt\") pod \"b5103830-7ce7-498d-9501-3724c56ceed6\" (UID: \"b5103830-7ce7-498d-9501-3724c56ceed6\") " Apr 20 14:34:42.985117 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.984709 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-bundle" (OuterVolumeSpecName: "bundle") pod "b5103830-7ce7-498d-9501-3724c56ceed6" (UID: "b5103830-7ce7-498d-9501-3724c56ceed6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:34:42.986767 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.986709 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5103830-7ce7-498d-9501-3724c56ceed6-kube-api-access-qp6lt" (OuterVolumeSpecName: "kube-api-access-qp6lt") pod "b5103830-7ce7-498d-9501-3724c56ceed6" (UID: "b5103830-7ce7-498d-9501-3724c56ceed6"). 
InnerVolumeSpecName "kube-api-access-qp6lt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:34:42.993008 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:42.992977 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-util" (OuterVolumeSpecName: "util") pod "b5103830-7ce7-498d-9501-3724c56ceed6" (UID: "b5103830-7ce7-498d-9501-3724c56ceed6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:34:43.060532 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.060502 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-jwzv9"] Apr 20 14:34:43.063310 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:34:43.063281 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8b40831_5e06_4022_a448_b1ab7cb75c23.slice/crio-2101feecc1cb39264325e5f8b6cd96a2e83d24a301aeb903c80a8502a23f498d WatchSource:0}: Error finding container 2101feecc1cb39264325e5f8b6cd96a2e83d24a301aeb903c80a8502a23f498d: Status 404 returned error can't find the container with id 2101feecc1cb39264325e5f8b6cd96a2e83d24a301aeb903c80a8502a23f498d Apr 20 14:34:43.084886 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.084857 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp6lt\" (UniqueName: \"kubernetes.io/projected/b5103830-7ce7-498d-9501-3724c56ceed6-kube-api-access-qp6lt\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:43.084886 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.084881 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:43.084886 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.084890 2581 
reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5103830-7ce7-498d-9501-3724c56ceed6-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:43.800699 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.800676 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" Apr 20 14:34:43.800849 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.800705 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c564jsw" event={"ID":"b5103830-7ce7-498d-9501-3724c56ceed6","Type":"ContainerDied","Data":"dbe52638a3a3f935be73f59a8290d055fb7593cbd2a9eead08df2bfaecd88046"} Apr 20 14:34:43.800849 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.800754 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe52638a3a3f935be73f59a8290d055fb7593cbd2a9eead08df2bfaecd88046" Apr 20 14:34:43.802055 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.802034 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-jwzv9" event={"ID":"d8b40831-5e06-4022-a448-b1ab7cb75c23","Type":"ContainerStarted","Data":"f675b991d8d1fc5be8cf515f187697d47a6a84769a180f1863c86f0fc51f7ab7"} Apr 20 14:34:43.802152 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.802059 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-jwzv9" event={"ID":"d8b40831-5e06-4022-a448-b1ab7cb75c23","Type":"ContainerStarted","Data":"2101feecc1cb39264325e5f8b6cd96a2e83d24a301aeb903c80a8502a23f498d"} Apr 20 14:34:43.824526 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:43.824464 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-jwzv9" podStartSLOduration=1.824452159 
podStartE2EDuration="1.824452159s" podCreationTimestamp="2026-04-20 14:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:34:43.822347836 +0000 UTC m=+464.001836303" watchObservedRunningTime="2026-04-20 14:34:43.824452159 +0000 UTC m=+464.003940627" Apr 20 14:34:54.902799 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.902763 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8"] Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903025 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="util" Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903035 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="util" Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903042 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="extract" Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903048 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="extract" Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903059 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="pull" Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903064 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="pull" Apr 20 14:34:54.903248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.903110 2581 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="b5103830-7ce7-498d-9501-3724c56ceed6" containerName="extract" Apr 20 14:34:54.906230 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.906214 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:54.908890 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.908870 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 14:34:54.909947 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.909919 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 14:34:54.910040 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.910012 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-75g9b\"" Apr 20 14:34:54.916200 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.916179 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8"] Apr 20 14:34:54.963588 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.963563 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vkrs\" (UniqueName: \"kubernetes.io/projected/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-kube-api-access-2vkrs\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:54.963682 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.963593 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:54.963682 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:54.963635 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.064923 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.064891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vkrs\" (UniqueName: \"kubernetes.io/projected/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-kube-api-access-2vkrs\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.064923 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.064925 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.065082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.064969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.065320 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.065305 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.065366 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.065352 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.076991 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.076961 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vkrs\" (UniqueName: \"kubernetes.io/projected/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-kube-api-access-2vkrs\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.215192 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.215165 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:55.353696 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.353663 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8"] Apr 20 14:34:55.356283 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:34:55.356257 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0ee4815_f0d1_41e5_9443_e5ddf0a1f490.slice/crio-83149010122c924108a8cb0cf8d56ef35dd0f19fb1643d714295dec8f048122e WatchSource:0}: Error finding container 83149010122c924108a8cb0cf8d56ef35dd0f19fb1643d714295dec8f048122e: Status 404 returned error can't find the container with id 83149010122c924108a8cb0cf8d56ef35dd0f19fb1643d714295dec8f048122e Apr 20 14:34:55.839859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.839833 2581 generic.go:358] "Generic (PLEG): container finished" podID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerID="312dc8357d0bda9b6b33cbf377c81fcd60d6c41a8ebb1d57cbb87d67329bbc5a" exitCode=0 Apr 20 14:34:55.840015 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.839883 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" event={"ID":"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490","Type":"ContainerDied","Data":"312dc8357d0bda9b6b33cbf377c81fcd60d6c41a8ebb1d57cbb87d67329bbc5a"} Apr 20 14:34:55.840015 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.839906 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" event={"ID":"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490","Type":"ContainerStarted","Data":"83149010122c924108a8cb0cf8d56ef35dd0f19fb1643d714295dec8f048122e"} Apr 20 14:34:55.852854 ip-10-0-139-136 kubenswrapper[2581]: 
I0420 14:34:55.852829 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28"] Apr 20 14:34:55.856111 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.856092 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:55.858147 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.858129 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 14:34:55.858519 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.858504 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-wjn8h\"" Apr 20 14:34:55.859597 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.859574 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 14:34:55.859702 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.859678 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 14:34:55.859783 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.859702 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 14:34:55.871784 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.871762 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28"] Apr 20 14:34:55.971697 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.971669 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-webhook-cert\") pod 
\"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:55.972029 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.971780 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:55.972029 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:55.971799 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4qq\" (UniqueName: \"kubernetes.io/projected/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-kube-api-access-gm4qq\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.073009 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.072983 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.073009 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.073011 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4qq\" (UniqueName: \"kubernetes.io/projected/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-kube-api-access-gm4qq\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: 
\"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.073143 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.073040 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.075444 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.075427 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.075544 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.075464 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.081594 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.081573 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4qq\" (UniqueName: \"kubernetes.io/projected/b65f99ff-f35b-4eb0-bb2f-54ff0e147fab-kube-api-access-gm4qq\") pod \"opendatahub-operator-controller-manager-65c545df94-ssz28\" (UID: \"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.165882 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:34:56.165821 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:56.288188 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.288109 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28"] Apr 20 14:34:56.290569 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:34:56.290544 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb65f99ff_f35b_4eb0_bb2f_54ff0e147fab.slice/crio-f98761f1fee607d61881c3f92b74304c7119e0e329d7a8860c2b355d28830ce3 WatchSource:0}: Error finding container f98761f1fee607d61881c3f92b74304c7119e0e329d7a8860c2b355d28830ce3: Status 404 returned error can't find the container with id f98761f1fee607d61881c3f92b74304c7119e0e329d7a8860c2b355d28830ce3 Apr 20 14:34:56.848073 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.848044 2581 generic.go:358] "Generic (PLEG): container finished" podID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerID="11ec58dfbcdf74cff7b7a88c03294dfdea2f24aae98c667447c6ee7e3fde80b2" exitCode=0 Apr 20 14:34:56.848264 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.848118 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" event={"ID":"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490","Type":"ContainerDied","Data":"11ec58dfbcdf74cff7b7a88c03294dfdea2f24aae98c667447c6ee7e3fde80b2"} Apr 20 14:34:56.849311 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:56.849287 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" event={"ID":"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab","Type":"ContainerStarted","Data":"f98761f1fee607d61881c3f92b74304c7119e0e329d7a8860c2b355d28830ce3"} Apr 20 
14:34:57.854957 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:57.854921 2581 generic.go:358] "Generic (PLEG): container finished" podID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerID="72a7355a5658dde17b659329e2aa7724a859d472e5c67c648f9b320ab9099620" exitCode=0 Apr 20 14:34:57.855341 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:57.855010 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" event={"ID":"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490","Type":"ContainerDied","Data":"72a7355a5658dde17b659329e2aa7724a859d472e5c67c648f9b320ab9099620"} Apr 20 14:34:58.859515 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:58.859478 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" event={"ID":"b65f99ff-f35b-4eb0-bb2f-54ff0e147fab","Type":"ContainerStarted","Data":"fc537aaf740f1547f37791c548b001e66ef33b5f99b8e16b30789e5ede46b40a"} Apr 20 14:34:58.859886 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:58.859632 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:34:58.880879 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:58.880817 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" podStartSLOduration=1.402435489 podStartE2EDuration="3.880799521s" podCreationTimestamp="2026-04-20 14:34:55 +0000 UTC" firstStartedPulling="2026-04-20 14:34:56.292461419 +0000 UTC m=+476.471949866" lastFinishedPulling="2026-04-20 14:34:58.770825435 +0000 UTC m=+478.950313898" observedRunningTime="2026-04-20 14:34:58.878656908 +0000 UTC m=+479.058145378" watchObservedRunningTime="2026-04-20 14:34:58.880799521 +0000 UTC m=+479.060287990" Apr 20 14:34:58.984926 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:58.984905 
2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:59.100598 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.100533 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vkrs\" (UniqueName: \"kubernetes.io/projected/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-kube-api-access-2vkrs\") pod \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " Apr 20 14:34:59.100598 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.100565 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-bundle\") pod \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " Apr 20 14:34:59.100598 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.100587 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-util\") pod \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\" (UID: \"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490\") " Apr 20 14:34:59.101515 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.101489 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-bundle" (OuterVolumeSpecName: "bundle") pod "b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" (UID: "b0ee4815-f0d1-41e5-9443-e5ddf0a1f490"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:34:59.102792 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.102770 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-kube-api-access-2vkrs" (OuterVolumeSpecName: "kube-api-access-2vkrs") pod "b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" (UID: "b0ee4815-f0d1-41e5-9443-e5ddf0a1f490"). InnerVolumeSpecName "kube-api-access-2vkrs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:34:59.105962 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.105942 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-util" (OuterVolumeSpecName: "util") pod "b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" (UID: "b0ee4815-f0d1-41e5-9443-e5ddf0a1f490"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:34:59.201403 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.201379 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vkrs\" (UniqueName: \"kubernetes.io/projected/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-kube-api-access-2vkrs\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:59.201403 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.201403 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:59.201522 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.201412 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0ee4815-f0d1-41e5-9443-e5ddf0a1f490-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:34:59.864557 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.864525 2581 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" Apr 20 14:34:59.864557 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.864536 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qqtg8" event={"ID":"b0ee4815-f0d1-41e5-9443-e5ddf0a1f490","Type":"ContainerDied","Data":"83149010122c924108a8cb0cf8d56ef35dd0f19fb1643d714295dec8f048122e"} Apr 20 14:34:59.865097 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:34:59.864571 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83149010122c924108a8cb0cf8d56ef35dd0f19fb1643d714295dec8f048122e" Apr 20 14:35:09.867524 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:09.867487 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-ssz28" Apr 20 14:35:14.588858 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.588824 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-87db58fcf-kth4n"] Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589092 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="pull" Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589103 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="pull" Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589123 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="extract" Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589129 2581 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="extract" Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589137 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="util" Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589144 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="util" Apr 20 14:35:14.589233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.589191 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0ee4815-f0d1-41e5-9443-e5ddf0a1f490" containerName="extract" Apr 20 14:35:14.592271 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.592254 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.595184 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.595160 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 14:35:14.596150 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.596123 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-sq2fv\"" Apr 20 14:35:14.596150 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.596143 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 14:35:14.604529 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.604505 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-87db58fcf-kth4n"] Apr 20 14:35:14.694204 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.694176 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9"] Apr 20 
14:35:14.697455 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.697441 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.700301 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.700275 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 14:35:14.700428 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.700399 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-75g9b\"" Apr 20 14:35:14.700487 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.700474 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 14:35:14.707052 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.707030 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9"] Apr 20 14:35:14.714780 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.714758 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2ss\" (UniqueName: \"kubernetes.io/projected/cc9fc9e7-8226-482c-a827-42039d4cc2b3-kube-api-access-fp2ss\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.714868 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.714803 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9fc9e7-8226-482c-a827-42039d4cc2b3-tls-certs\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " 
pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.714868 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.714854 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc9fc9e7-8226-482c-a827-42039d4cc2b3-tmp\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.815469 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.815443 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9fc9e7-8226-482c-a827-42039d4cc2b3-tls-certs\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.815579 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.815494 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc9fc9e7-8226-482c-a827-42039d4cc2b3-tmp\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.815579 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.815536 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwvl\" (UniqueName: \"kubernetes.io/projected/927b5c77-2f16-4d97-979c-db11fb250bab-kube-api-access-tfwvl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.815579 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.815563 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.815719 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.815604 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2ss\" (UniqueName: \"kubernetes.io/projected/cc9fc9e7-8226-482c-a827-42039d4cc2b3-kube-api-access-fp2ss\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.815719 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.815637 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.817962 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.817946 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc9fc9e7-8226-482c-a827-42039d4cc2b3-tmp\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.818051 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.818030 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cc9fc9e7-8226-482c-a827-42039d4cc2b3-tls-certs\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " 
pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.824392 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.824375 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2ss\" (UniqueName: \"kubernetes.io/projected/cc9fc9e7-8226-482c-a827-42039d4cc2b3-kube-api-access-fp2ss\") pod \"kube-auth-proxy-87db58fcf-kth4n\" (UID: \"cc9fc9e7-8226-482c-a827-42039d4cc2b3\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.902242 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.902178 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" Apr 20 14:35:14.916189 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.916161 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.916287 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.916275 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfwvl\" (UniqueName: \"kubernetes.io/projected/927b5c77-2f16-4d97-979c-db11fb250bab-kube-api-access-tfwvl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.916338 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.916306 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-bundle\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.916533 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.916515 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.916619 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.916601 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:14.936673 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:14.936646 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfwvl\" (UniqueName: \"kubernetes.io/projected/927b5c77-2f16-4d97-979c-db11fb250bab-kube-api-access-tfwvl\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:15.006990 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.006961 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:15.023245 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.023193 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-87db58fcf-kth4n"] Apr 20 14:35:15.026101 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:15.025891 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc9fc9e7_8226_482c_a827_42039d4cc2b3.slice/crio-e0bf1bf6e74aff3f0b69cc7ebfa4936709368f79d635216ebc26fa46c677a5ca WatchSource:0}: Error finding container e0bf1bf6e74aff3f0b69cc7ebfa4936709368f79d635216ebc26fa46c677a5ca: Status 404 returned error can't find the container with id e0bf1bf6e74aff3f0b69cc7ebfa4936709368f79d635216ebc26fa46c677a5ca Apr 20 14:35:15.128091 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.128069 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9"] Apr 20 14:35:15.129315 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:15.129287 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod927b5c77_2f16_4d97_979c_db11fb250bab.slice/crio-55221e9d1615d004fde9f9d4a4193f1fb4dd480f6c7a23b30cc66330f4e9d92d WatchSource:0}: Error finding container 55221e9d1615d004fde9f9d4a4193f1fb4dd480f6c7a23b30cc66330f4e9d92d: Status 404 returned error can't find the container with id 55221e9d1615d004fde9f9d4a4193f1fb4dd480f6c7a23b30cc66330f4e9d92d Apr 20 14:35:15.916931 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.916878 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" 
event={"ID":"cc9fc9e7-8226-482c-a827-42039d4cc2b3","Type":"ContainerStarted","Data":"e0bf1bf6e74aff3f0b69cc7ebfa4936709368f79d635216ebc26fa46c677a5ca"} Apr 20 14:35:15.918655 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.918608 2581 generic.go:358] "Generic (PLEG): container finished" podID="927b5c77-2f16-4d97-979c-db11fb250bab" containerID="76f6ce4bafdf44eff421f4b50a074584e47b32a71af2f761e606400ac0afd6ee" exitCode=0 Apr 20 14:35:15.918805 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.918658 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" event={"ID":"927b5c77-2f16-4d97-979c-db11fb250bab","Type":"ContainerDied","Data":"76f6ce4bafdf44eff421f4b50a074584e47b32a71af2f761e606400ac0afd6ee"} Apr 20 14:35:15.918805 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:15.918678 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" event={"ID":"927b5c77-2f16-4d97-979c-db11fb250bab","Type":"ContainerStarted","Data":"55221e9d1615d004fde9f9d4a4193f1fb4dd480f6c7a23b30cc66330f4e9d92d"} Apr 20 14:35:16.810613 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.810576 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-nkhrj"] Apr 20 14:35:16.821862 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.821835 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:16.823787 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.823739 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-nkhrj"] Apr 20 14:35:16.824496 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.824474 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 14:35:16.824814 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.824792 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-nl64x\"" Apr 20 14:35:16.931784 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.931752 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rd7w\" (UniqueName: \"kubernetes.io/projected/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-kube-api-access-2rd7w\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:16.932119 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:16.931819 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:17.033102 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:17.033070 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rd7w\" (UniqueName: \"kubernetes.io/projected/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-kube-api-access-2rd7w\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:17.033232 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:17.033124 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:17.033232 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:35:17.033204 2581 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 14:35:17.033297 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:35:17.033254 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert podName:ba024e5e-b9fa-4ab8-bb53-1fe317a59889 nodeName:}" failed. No retries permitted until 2026-04-20 14:35:17.533238033 +0000 UTC m=+497.712726483 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert") pod "odh-model-controller-858dbf95b8-nkhrj" (UID: "ba024e5e-b9fa-4ab8-bb53-1fe317a59889") : secret "odh-model-controller-webhook-cert" not found Apr 20 14:35:17.042415 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:17.042395 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rd7w\" (UniqueName: \"kubernetes.io/projected/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-kube-api-access-2rd7w\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:17.538386 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:17.538352 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:17.538566 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:35:17.538501 2581 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 14:35:17.538631 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:35:17.538570 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert podName:ba024e5e-b9fa-4ab8-bb53-1fe317a59889 nodeName:}" failed. No retries permitted until 2026-04-20 14:35:18.538549311 +0000 UTC m=+498.718037763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert") pod "odh-model-controller-858dbf95b8-nkhrj" (UID: "ba024e5e-b9fa-4ab8-bb53-1fe317a59889") : secret "odh-model-controller-webhook-cert" not found Apr 20 14:35:17.929089 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:17.928999 2581 generic.go:358] "Generic (PLEG): container finished" podID="927b5c77-2f16-4d97-979c-db11fb250bab" containerID="0deee41eb8ca5212bc9cca6bbbd65e46d97fa13371c5742c74fb9d67f2c94c44" exitCode=0 Apr 20 14:35:17.929089 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:17.929071 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" event={"ID":"927b5c77-2f16-4d97-979c-db11fb250bab","Type":"ContainerDied","Data":"0deee41eb8ca5212bc9cca6bbbd65e46d97fa13371c5742c74fb9d67f2c94c44"} Apr 20 14:35:18.549175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:18.549132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:18.552139 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:18.552111 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba024e5e-b9fa-4ab8-bb53-1fe317a59889-cert\") pod \"odh-model-controller-858dbf95b8-nkhrj\" (UID: \"ba024e5e-b9fa-4ab8-bb53-1fe317a59889\") " pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:18.637447 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:18.637412 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:18.935456 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:18.935378 2581 generic.go:358] "Generic (PLEG): container finished" podID="927b5c77-2f16-4d97-979c-db11fb250bab" containerID="66cabbb3f7e0c21e103dd3fa8309d554772dc9f468b9e7538a2f29c712764bae" exitCode=0 Apr 20 14:35:18.935607 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:18.935461 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" event={"ID":"927b5c77-2f16-4d97-979c-db11fb250bab","Type":"ContainerDied","Data":"66cabbb3f7e0c21e103dd3fa8309d554772dc9f468b9e7538a2f29c712764bae"} Apr 20 14:35:19.185701 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:19.185650 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-nkhrj"] Apr 20 14:35:19.187459 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:19.187423 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba024e5e_b9fa_4ab8_bb53_1fe317a59889.slice/crio-1c441cea257a363dc3110a06f539a3e0c03ead1e9e01ae64bc71c43db45095d0 WatchSource:0}: Error finding container 1c441cea257a363dc3110a06f539a3e0c03ead1e9e01ae64bc71c43db45095d0: Status 404 returned error can't find the container with id 1c441cea257a363dc3110a06f539a3e0c03ead1e9e01ae64bc71c43db45095d0 Apr 20 14:35:19.940877 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:19.940836 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" event={"ID":"ba024e5e-b9fa-4ab8-bb53-1fe317a59889","Type":"ContainerStarted","Data":"1c441cea257a363dc3110a06f539a3e0c03ead1e9e01ae64bc71c43db45095d0"} Apr 20 14:35:19.943244 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:19.943199 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" event={"ID":"cc9fc9e7-8226-482c-a827-42039d4cc2b3","Type":"ContainerStarted","Data":"1e88b185c531abd7e230cbb486ec8643f6de4df6ef3aa7ad5ee38f088d53c0a6"} Apr 20 14:35:19.962304 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:19.962249 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-87db58fcf-kth4n" podStartSLOduration=1.866149742 podStartE2EDuration="5.962229542s" podCreationTimestamp="2026-04-20 14:35:14 +0000 UTC" firstStartedPulling="2026-04-20 14:35:15.027768613 +0000 UTC m=+495.207257059" lastFinishedPulling="2026-04-20 14:35:19.123848409 +0000 UTC m=+499.303336859" observedRunningTime="2026-04-20 14:35:19.959778161 +0000 UTC m=+500.139266629" watchObservedRunningTime="2026-04-20 14:35:19.962229542 +0000 UTC m=+500.141718014" Apr 20 14:35:20.115037 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.115012 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:20.262928 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.262896 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfwvl\" (UniqueName: \"kubernetes.io/projected/927b5c77-2f16-4d97-979c-db11fb250bab-kube-api-access-tfwvl\") pod \"927b5c77-2f16-4d97-979c-db11fb250bab\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " Apr 20 14:35:20.263103 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.262961 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-util\") pod \"927b5c77-2f16-4d97-979c-db11fb250bab\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " Apr 20 14:35:20.263103 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.263083 2581 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-bundle\") pod \"927b5c77-2f16-4d97-979c-db11fb250bab\" (UID: \"927b5c77-2f16-4d97-979c-db11fb250bab\") " Apr 20 14:35:20.264205 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.264161 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-bundle" (OuterVolumeSpecName: "bundle") pod "927b5c77-2f16-4d97-979c-db11fb250bab" (UID: "927b5c77-2f16-4d97-979c-db11fb250bab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:35:20.265389 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.265362 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/927b5c77-2f16-4d97-979c-db11fb250bab-kube-api-access-tfwvl" (OuterVolumeSpecName: "kube-api-access-tfwvl") pod "927b5c77-2f16-4d97-979c-db11fb250bab" (UID: "927b5c77-2f16-4d97-979c-db11fb250bab"). InnerVolumeSpecName "kube-api-access-tfwvl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:35:20.269956 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.269918 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-util" (OuterVolumeSpecName: "util") pod "927b5c77-2f16-4d97-979c-db11fb250bab" (UID: "927b5c77-2f16-4d97-979c-db11fb250bab"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:35:20.363876 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.363852 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:35:20.363876 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.363875 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/927b5c77-2f16-4d97-979c-db11fb250bab-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:35:20.364050 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.363885 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfwvl\" (UniqueName: \"kubernetes.io/projected/927b5c77-2f16-4d97-979c-db11fb250bab-kube-api-access-tfwvl\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:35:20.948992 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.948946 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" event={"ID":"927b5c77-2f16-4d97-979c-db11fb250bab","Type":"ContainerDied","Data":"55221e9d1615d004fde9f9d4a4193f1fb4dd480f6c7a23b30cc66330f4e9d92d"} Apr 20 14:35:20.949482 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.949006 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55221e9d1615d004fde9f9d4a4193f1fb4dd480f6c7a23b30cc66330f4e9d92d" Apr 20 14:35:20.949482 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:20.948969 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48358dms9" Apr 20 14:35:22.657466 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657432 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bs64c"] Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657733 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="util" Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657746 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="util" Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657753 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="pull" Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657758 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="pull" Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657773 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="extract" Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657778 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="extract" Apr 20 14:35:22.657859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.657828 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="927b5c77-2f16-4d97-979c-db11fb250bab" containerName="extract" Apr 20 14:35:22.660642 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.660628 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:22.664438 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.664415 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-dm8ws\"" Apr 20 14:35:22.664548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.664423 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 14:35:22.684025 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.683999 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bs64c"] Apr 20 14:35:22.785013 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.784991 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-cert\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: \"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:22.785136 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.785068 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24tj\" (UniqueName: \"kubernetes.io/projected/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-kube-api-access-g24tj\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: \"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:22.886214 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.886188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g24tj\" (UniqueName: \"kubernetes.io/projected/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-kube-api-access-g24tj\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: 
\"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:22.886327 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.886231 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-cert\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: \"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:22.886371 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:35:22.886360 2581 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 14:35:22.886425 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:35:22.886415 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-cert podName:b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41 nodeName:}" failed. No retries permitted until 2026-04-20 14:35:23.386396929 +0000 UTC m=+503.565885631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-cert") pod "kserve-controller-manager-856948b99f-bs64c" (UID: "b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41") : secret "kserve-webhook-server-cert" not found Apr 20 14:35:22.901528 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.901502 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24tj\" (UniqueName: \"kubernetes.io/projected/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-kube-api-access-g24tj\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: \"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:22.960678 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.960618 2581 generic.go:358] "Generic (PLEG): container finished" podID="ba024e5e-b9fa-4ab8-bb53-1fe317a59889" containerID="cdacdf43c6cd744aa280e6454d921ca2a2ed8ec51efceda6f40ed71091c624e1" exitCode=1 Apr 20 14:35:22.960678 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.960657 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" event={"ID":"ba024e5e-b9fa-4ab8-bb53-1fe317a59889","Type":"ContainerDied","Data":"cdacdf43c6cd744aa280e6454d921ca2a2ed8ec51efceda6f40ed71091c624e1"} Apr 20 14:35:22.960915 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:22.960900 2581 scope.go:117] "RemoveContainer" containerID="cdacdf43c6cd744aa280e6454d921ca2a2ed8ec51efceda6f40ed71091c624e1" Apr 20 14:35:23.391672 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.391643 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-cert\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: \"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:23.394139 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:35:23.394116 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41-cert\") pod \"kserve-controller-manager-856948b99f-bs64c\" (UID: \"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41\") " pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:23.570115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.570051 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:35:23.690165 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.690138 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-bs64c"] Apr 20 14:35:23.691805 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:23.691771 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ad245f_c49d_4e9e_9e02_d7e5dc0e8c41.slice/crio-de6e27381a599ffe95e9125ae9b0a5a1fcef1743c65d00f997f17bb5aff59429 WatchSource:0}: Error finding container de6e27381a599ffe95e9125ae9b0a5a1fcef1743c65d00f997f17bb5aff59429: Status 404 returned error can't find the container with id de6e27381a599ffe95e9125ae9b0a5a1fcef1743c65d00f997f17bb5aff59429 Apr 20 14:35:23.965382 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.965349 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" event={"ID":"ba024e5e-b9fa-4ab8-bb53-1fe317a59889","Type":"ContainerStarted","Data":"2ece8856e9679948874045b8460cf789ae084a855704c0e52b108c890ce3396a"} Apr 20 14:35:23.965576 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.965477 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" Apr 20 14:35:23.966548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.966526 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" event={"ID":"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41","Type":"ContainerStarted","Data":"de6e27381a599ffe95e9125ae9b0a5a1fcef1743c65d00f997f17bb5aff59429"}
Apr 20 14:35:23.990336 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:23.990285 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj" podStartSLOduration=3.882379541 podStartE2EDuration="7.990270841s" podCreationTimestamp="2026-04-20 14:35:16 +0000 UTC" firstStartedPulling="2026-04-20 14:35:19.188824826 +0000 UTC m=+499.368313275" lastFinishedPulling="2026-04-20 14:35:23.296716129 +0000 UTC m=+503.476204575" observedRunningTime="2026-04-20 14:35:23.988154502 +0000 UTC m=+504.167642970" watchObservedRunningTime="2026-04-20 14:35:23.990270841 +0000 UTC m=+504.169759312"
Apr 20 14:35:27.988740 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:27.988683 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" event={"ID":"b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41","Type":"ContainerStarted","Data":"f40c0992187b87c2f3b444e4cc54c87aef38a13d9c439e7bc7caa48c155ae202"}
Apr 20 14:35:27.989172 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:27.988870 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c"
Apr 20 14:35:28.025373 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:28.025325 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" podStartSLOduration=2.717176099 podStartE2EDuration="6.025310471s" podCreationTimestamp="2026-04-20 14:35:22 +0000 UTC" firstStartedPulling="2026-04-20 14:35:23.69353501 +0000 UTC m=+503.873023466" lastFinishedPulling="2026-04-20 14:35:27.00166939 +0000 UTC m=+507.181157838" observedRunningTime="2026-04-20 14:35:28.022094076 +0000 UTC m=+508.201582544" watchObservedRunningTime="2026-04-20 14:35:28.025310471 +0000 UTC m=+508.204798938"
Apr 20 14:35:29.320314 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.320286 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"]
Apr 20 14:35:29.323666 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.323649 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.327936 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.327918 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 14:35:29.328568 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.328549 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-75g9b\""
Apr 20 14:35:29.328677 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.328626 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 14:35:29.342293 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.342269 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"]
Apr 20 14:35:29.440223 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.440192 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrtrm\" (UniqueName: \"kubernetes.io/projected/6940322a-a66b-45f4-880f-fe535ce7cc7f-kube-api-access-xrtrm\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.440366 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.440229 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.440366 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.440264 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.541194 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.541160 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.541358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.541334 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrtrm\" (UniqueName: \"kubernetes.io/projected/6940322a-a66b-45f4-880f-fe535ce7cc7f-kube-api-access-xrtrm\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.541424 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.541410 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.541648 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.541627 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.541705 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.541682 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.552296 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.552275 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrtrm\" (UniqueName: \"kubernetes.io/projected/6940322a-a66b-45f4-880f-fe535ce7cc7f-kube-api-access-xrtrm\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.632427 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.632374 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:29.810375 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.810349 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"]
Apr 20 14:35:29.812372 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:29.812341 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6940322a_a66b_45f4_880f_fe535ce7cc7f.slice/crio-057618eefe6358bebeedd7586c35eeeb770361e936358911f6ba831e0847eb67 WatchSource:0}: Error finding container 057618eefe6358bebeedd7586c35eeeb770361e936358911f6ba831e0847eb67: Status 404 returned error can't find the container with id 057618eefe6358bebeedd7586c35eeeb770361e936358911f6ba831e0847eb67
Apr 20 14:35:29.997259 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.997229 2581 generic.go:358] "Generic (PLEG): container finished" podID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerID="946e610e168601a9222b52bcc8c51e23e3232b783892f98afa4a621f170790f9" exitCode=0
Apr 20 14:35:29.997397 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.997287 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw" event={"ID":"6940322a-a66b-45f4-880f-fe535ce7cc7f","Type":"ContainerDied","Data":"946e610e168601a9222b52bcc8c51e23e3232b783892f98afa4a621f170790f9"}
Apr 20 14:35:29.997397 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:29.997311 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw" event={"ID":"6940322a-a66b-45f4-880f-fe535ce7cc7f","Type":"ContainerStarted","Data":"057618eefe6358bebeedd7586c35eeeb770361e936358911f6ba831e0847eb67"}
Apr 20 14:35:30.549699 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.549661 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"]
Apr 20 14:35:30.552896 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.552879 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.555234 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.555215 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-zpxr2\""
Apr 20 14:35:30.555609 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.555586 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 20 14:35:30.555751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.555714 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 20 14:35:30.567294 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.567274 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"]
Apr 20 14:35:30.650497 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.650472 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9dc03660-67fe-4971-b2b8-56958236fb9b-operator-config\") pod \"servicemesh-operator3-55f49c5f94-f7sv6\" (UID: \"9dc03660-67fe-4971-b2b8-56958236fb9b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.650582 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.650502 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jlm\" (UniqueName: \"kubernetes.io/projected/9dc03660-67fe-4971-b2b8-56958236fb9b-kube-api-access-v8jlm\") pod \"servicemesh-operator3-55f49c5f94-f7sv6\" (UID: \"9dc03660-67fe-4971-b2b8-56958236fb9b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.751596 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.751534 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9dc03660-67fe-4971-b2b8-56958236fb9b-operator-config\") pod \"servicemesh-operator3-55f49c5f94-f7sv6\" (UID: \"9dc03660-67fe-4971-b2b8-56958236fb9b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.751596 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.751578 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jlm\" (UniqueName: \"kubernetes.io/projected/9dc03660-67fe-4971-b2b8-56958236fb9b-kube-api-access-v8jlm\") pod \"servicemesh-operator3-55f49c5f94-f7sv6\" (UID: \"9dc03660-67fe-4971-b2b8-56958236fb9b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.754310 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.754290 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/9dc03660-67fe-4971-b2b8-56958236fb9b-operator-config\") pod \"servicemesh-operator3-55f49c5f94-f7sv6\" (UID: \"9dc03660-67fe-4971-b2b8-56958236fb9b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.762839 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.762818 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jlm\" (UniqueName: \"kubernetes.io/projected/9dc03660-67fe-4971-b2b8-56958236fb9b-kube-api-access-v8jlm\") pod \"servicemesh-operator3-55f49c5f94-f7sv6\" (UID: \"9dc03660-67fe-4971-b2b8-56958236fb9b\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:30.867018 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:30.866948 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:31.004835 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:31.004812 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"]
Apr 20 14:35:31.006686 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:31.006661 2581 generic.go:358] "Generic (PLEG): container finished" podID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerID="57c883f2315cc636a7e7dfcce47b7e140455a28063f329a34d6af41a462364a0" exitCode=0
Apr 20 14:35:31.006803 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:31.006696 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw" event={"ID":"6940322a-a66b-45f4-880f-fe535ce7cc7f","Type":"ContainerDied","Data":"57c883f2315cc636a7e7dfcce47b7e140455a28063f329a34d6af41a462364a0"}
Apr 20 14:35:31.007786 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:31.007765 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc03660_67fe_4971_b2b8_56958236fb9b.slice/crio-337ca0ee2ed87cc8a846a1753f150567be8b15adc2274e446081ab6127bb28a5 WatchSource:0}: Error finding container 337ca0ee2ed87cc8a846a1753f150567be8b15adc2274e446081ab6127bb28a5: Status 404 returned error can't find the container with id 337ca0ee2ed87cc8a846a1753f150567be8b15adc2274e446081ab6127bb28a5
Apr 20 14:35:32.012001 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:32.011970 2581 generic.go:358] "Generic (PLEG): container finished" podID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerID="f7baa837ebbb9f2017c35422be4f48efb69f9570136986e85bd86c42a8aed948" exitCode=0
Apr 20 14:35:32.012405 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:32.012040 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw" event={"ID":"6940322a-a66b-45f4-880f-fe535ce7cc7f","Type":"ContainerDied","Data":"f7baa837ebbb9f2017c35422be4f48efb69f9570136986e85bd86c42a8aed948"}
Apr 20 14:35:32.013521 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:32.013499 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6" event={"ID":"9dc03660-67fe-4971-b2b8-56958236fb9b","Type":"ContainerStarted","Data":"337ca0ee2ed87cc8a846a1753f150567be8b15adc2274e446081ab6127bb28a5"}
Apr 20 14:35:33.144154 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.144129 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:33.173074 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.173052 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-util\") pod \"6940322a-a66b-45f4-880f-fe535ce7cc7f\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") "
Apr 20 14:35:33.173213 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.173082 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-bundle\") pod \"6940322a-a66b-45f4-880f-fe535ce7cc7f\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") "
Apr 20 14:35:33.173213 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.173116 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrtrm\" (UniqueName: \"kubernetes.io/projected/6940322a-a66b-45f4-880f-fe535ce7cc7f-kube-api-access-xrtrm\") pod \"6940322a-a66b-45f4-880f-fe535ce7cc7f\" (UID: \"6940322a-a66b-45f4-880f-fe535ce7cc7f\") "
Apr 20 14:35:33.175248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.175219 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-bundle" (OuterVolumeSpecName: "bundle") pod "6940322a-a66b-45f4-880f-fe535ce7cc7f" (UID: "6940322a-a66b-45f4-880f-fe535ce7cc7f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:35:33.176807 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.176773 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6940322a-a66b-45f4-880f-fe535ce7cc7f-kube-api-access-xrtrm" (OuterVolumeSpecName: "kube-api-access-xrtrm") pod "6940322a-a66b-45f4-880f-fe535ce7cc7f" (UID: "6940322a-a66b-45f4-880f-fe535ce7cc7f"). InnerVolumeSpecName "kube-api-access-xrtrm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:35:33.180529 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.180505 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-util" (OuterVolumeSpecName: "util") pod "6940322a-a66b-45f4-880f-fe535ce7cc7f" (UID: "6940322a-a66b-45f4-880f-fe535ce7cc7f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:35:33.274461 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.274375 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:35:33.274461 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.274412 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940322a-a66b-45f4-880f-fe535ce7cc7f-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:35:33.274461 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:33.274429 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrtrm\" (UniqueName: \"kubernetes.io/projected/6940322a-a66b-45f4-880f-fe535ce7cc7f-kube-api-access-xrtrm\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:35:34.023508 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.023477 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw"
Apr 20 14:35:34.023680 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.023479 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2mxfqw" event={"ID":"6940322a-a66b-45f4-880f-fe535ce7cc7f","Type":"ContainerDied","Data":"057618eefe6358bebeedd7586c35eeeb770361e936358911f6ba831e0847eb67"}
Apr 20 14:35:34.023680 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.023595 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="057618eefe6358bebeedd7586c35eeeb770361e936358911f6ba831e0847eb67"
Apr 20 14:35:34.405834 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.405737 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5ff44ddc76-xg9tb"]
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406101 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="util"
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406116 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="util"
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406127 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="pull"
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406132 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="pull"
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406139 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="extract"
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406145 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="extract"
Apr 20 14:35:34.406247 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.406210 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6940322a-a66b-45f4-880f-fe535ce7cc7f" containerName="extract"
Apr 20 14:35:34.412204 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.412012 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.419672 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.419647 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5ff44ddc76-xg9tb"]
Apr 20 14:35:34.486108 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486049 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-config\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.486280 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486190 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6mmb\" (UniqueName: \"kubernetes.io/projected/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-kube-api-access-w6mmb\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.486280 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486261 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-service-ca\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.486398 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486326 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-trusted-ca-bundle\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.486398 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-oauth-serving-cert\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.486505 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486413 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-serving-cert\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.486505 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.486486 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-oauth-config\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587146 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587109 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-trusted-ca-bundle\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587146 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587144 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-oauth-serving-cert\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587183 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-serving-cert\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587235 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-oauth-config\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587299 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-config\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587329 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6mmb\" (UniqueName: \"kubernetes.io/projected/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-kube-api-access-w6mmb\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.587363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587356 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-service-ca\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.588072 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.587998 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-trusted-ca-bundle\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.588523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.588474 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-service-ca\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.588631 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.588571 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-config\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.588631 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.588582 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-oauth-serving-cert\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.590610 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.590570 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-serving-cert\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.590808 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.590789 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-console-oauth-config\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.596313 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.596291 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6mmb\" (UniqueName: \"kubernetes.io/projected/6ab27e3b-7a3c-4245-a3f5-4de1786eafa2-kube-api-access-w6mmb\") pod \"console-5ff44ddc76-xg9tb\" (UID: \"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2\") " pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.728503 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.728462 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:34.978222 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:34.977478 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-nkhrj"
Apr 20 14:35:35.057973 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:35.057942 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5ff44ddc76-xg9tb"]
Apr 20 14:35:35.061183 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:35.061155 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab27e3b_7a3c_4245_a3f5_4de1786eafa2.slice/crio-b26956989d7bce05f5320f7eacce0292091a02eed5fac2129e7f393dbf64dd1b WatchSource:0}: Error finding container b26956989d7bce05f5320f7eacce0292091a02eed5fac2129e7f393dbf64dd1b: Status 404 returned error can't find the container with id b26956989d7bce05f5320f7eacce0292091a02eed5fac2129e7f393dbf64dd1b
Apr 20 14:35:36.032696 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:36.032661 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6" event={"ID":"9dc03660-67fe-4971-b2b8-56958236fb9b","Type":"ContainerStarted","Data":"4a2062bf0876d246daaaa0ad74db20166c45a8329ff11a023c2e91c2a0582449"}
Apr 20 14:35:36.033189 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:36.032746 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6"
Apr 20 14:35:36.034156 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:36.034134 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ff44ddc76-xg9tb" event={"ID":"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2","Type":"ContainerStarted","Data":"bf184d2e404621cf180d69b4a42521217649d3b091059a7531256cccd3cf989d"}
Apr 20 14:35:36.034276 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:36.034162 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ff44ddc76-xg9tb" event={"ID":"6ab27e3b-7a3c-4245-a3f5-4de1786eafa2","Type":"ContainerStarted","Data":"b26956989d7bce05f5320f7eacce0292091a02eed5fac2129e7f393dbf64dd1b"}
Apr 20 14:35:36.054335 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:36.054293 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6" podStartSLOduration=2.100376182 podStartE2EDuration="6.054280837s" podCreationTimestamp="2026-04-20 14:35:30 +0000 UTC" firstStartedPulling="2026-04-20 14:35:31.010198296 +0000 UTC m=+511.189686748" lastFinishedPulling="2026-04-20 14:35:34.964102956 +0000 UTC m=+515.143591403" observedRunningTime="2026-04-20 14:35:36.052245664 +0000 UTC m=+516.231734131" watchObservedRunningTime="2026-04-20 14:35:36.054280837 +0000 UTC m=+516.233769304"
Apr 20 14:35:36.071604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:36.071564 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5ff44ddc76-xg9tb" podStartSLOduration=2.071552838 podStartE2EDuration="2.071552838s" podCreationTimestamp="2026-04-20 14:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:35:36.069519763 +0000 UTC m=+516.249008260" watchObservedRunningTime="2026-04-20 14:35:36.071552838 +0000 UTC m=+516.251041308"
Apr 20 14:35:44.729418 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:44.729336 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:44.729823 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:44.729428 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:44.734172 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:44.734153 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:45.069973 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:45.069907 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5ff44ddc76-xg9tb"
Apr 20 14:35:45.119475 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:45.119444 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84df7bcd6c-5t7v8"]
Apr 20 14:35:46.309391 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.309355 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"]
Apr 20 14:35:46.312616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.312600 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"
Apr 20 14:35:46.315524 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.315496 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 20 14:35:46.315636 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.315574 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 20 14:35:46.315636 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.315581 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 14:35:46.315774 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.315666 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-47r8k\""
Apr 20 14:35:46.315839 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.315807 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 20 14:35:46.326985 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.326964 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"]
Apr 20 14:35:46.385216 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385196 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"
Apr 20 14:35:46.385327 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385250 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b1bb7fc-bf8e-4cc9-b695-286dac851053-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"
Apr 20 14:35:46.385327 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385276 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"
Apr 20 14:35:46.385327 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385312 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID:
\"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.385460 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385338 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.385460 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385361 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.385460 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.385381 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975vj\" (UniqueName: \"kubernetes.io/projected/6b1bb7fc-bf8e-4cc9-b695-286dac851053-kube-api-access-975vj\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486112 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486085 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486245 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:35:46.486122 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486245 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486152 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-975vj\" (UniqueName: \"kubernetes.io/projected/6b1bb7fc-bf8e-4cc9-b695-286dac851053-kube-api-access-975vj\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486245 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486205 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486393 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486258 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b1bb7fc-bf8e-4cc9-b695-286dac851053-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486393 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486293 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486393 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486314 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.486887 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.486858 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.488831 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.488810 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.489079 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.489059 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.489661 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.489633 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/6b1bb7fc-bf8e-4cc9-b695-286dac851053-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.489781 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.489645 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.494472 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.494453 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6b1bb7fc-bf8e-4cc9-b695-286dac851053-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.494573 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.494553 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-975vj\" (UniqueName: \"kubernetes.io/projected/6b1bb7fc-bf8e-4cc9-b695-286dac851053-kube-api-access-975vj\") pod \"istiod-openshift-gateway-55ff986f96-9zc9q\" (UID: \"6b1bb7fc-bf8e-4cc9-b695-286dac851053\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.622291 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.622216 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:46.755450 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:46.755428 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q"] Apr 20 14:35:46.757617 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:35:46.757585 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b1bb7fc_bf8e_4cc9_b695_286dac851053.slice/crio-7ec8c16996b1c75e59f9f194393c67d1c4a712dff3ab3ac5ea192bf50759278d WatchSource:0}: Error finding container 7ec8c16996b1c75e59f9f194393c67d1c4a712dff3ab3ac5ea192bf50759278d: Status 404 returned error can't find the container with id 7ec8c16996b1c75e59f9f194393c67d1c4a712dff3ab3ac5ea192bf50759278d Apr 20 14:35:47.039604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:47.039575 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-f7sv6" Apr 20 14:35:47.077322 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:47.077290 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" event={"ID":"6b1bb7fc-bf8e-4cc9-b695-286dac851053","Type":"ContainerStarted","Data":"7ec8c16996b1c75e59f9f194393c67d1c4a712dff3ab3ac5ea192bf50759278d"} Apr 20 14:35:49.977737 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:49.977665 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 14:35:49.977984 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:49.977784 2581 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 20 14:35:50.090747 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:35:50.090689 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" event={"ID":"6b1bb7fc-bf8e-4cc9-b695-286dac851053","Type":"ContainerStarted","Data":"57a56eb9473575b0e3f4aec76ec829594993924ce66159d01fc8fd823083d000"} Apr 20 14:35:50.090932 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:50.090917 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:50.114791 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:50.114736 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" podStartSLOduration=0.896766139 podStartE2EDuration="4.114706327s" podCreationTimestamp="2026-04-20 14:35:46 +0000 UTC" firstStartedPulling="2026-04-20 14:35:46.759429934 +0000 UTC m=+526.938918380" lastFinishedPulling="2026-04-20 14:35:49.977370107 +0000 UTC m=+530.156858568" observedRunningTime="2026-04-20 14:35:50.112938159 +0000 UTC m=+530.292426626" watchObservedRunningTime="2026-04-20 14:35:50.114706327 +0000 UTC m=+530.294194796" Apr 20 14:35:51.097569 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:51.097533 2581 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-9zc9q container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 14:35:51.098168 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:51.098120 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" podUID="6b1bb7fc-bf8e-4cc9-b695-286dac851053" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 14:35:54.096833 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:54.096801 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-9zc9q" Apr 20 14:35:58.997439 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:35:58.997402 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-bs64c" Apr 20 14:36:10.142761 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.142697 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84df7bcd6c-5t7v8" podUID="6853c866-17da-4ff6-99a0-2bdc83042e36" containerName="console" containerID="cri-o://0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2" gracePeriod=15 Apr 20 14:36:10.385625 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.385593 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84df7bcd6c-5t7v8_6853c866-17da-4ff6-99a0-2bdc83042e36/console/0.log" Apr 20 14:36:10.385765 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.385647 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:36:10.484938 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.484913 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-console-config\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.484966 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvdmc\" (UniqueName: \"kubernetes.io/projected/6853c866-17da-4ff6-99a0-2bdc83042e36-kube-api-access-pvdmc\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.484997 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-service-ca\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485020 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-oauth-serving-cert\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485087 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485048 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-serving-cert\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485087 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485078 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-oauth-config\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485331 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485116 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-trusted-ca-bundle\") pod \"6853c866-17da-4ff6-99a0-2bdc83042e36\" (UID: \"6853c866-17da-4ff6-99a0-2bdc83042e36\") " Apr 20 14:36:10.485388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485329 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-console-config" (OuterVolumeSpecName: "console-config") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:36:10.485543 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485518 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-console-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:10.485681 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485560 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:36:10.485681 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485576 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-service-ca" (OuterVolumeSpecName: "service-ca") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:36:10.486009 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.485818 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:36:10.487476 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.487445 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:36:10.487594 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.487573 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6853c866-17da-4ff6-99a0-2bdc83042e36-kube-api-access-pvdmc" (OuterVolumeSpecName: "kube-api-access-pvdmc") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "kube-api-access-pvdmc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:36:10.487659 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.487574 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6853c866-17da-4ff6-99a0-2bdc83042e36" (UID: "6853c866-17da-4ff6-99a0-2bdc83042e36"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:36:10.586107 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.586068 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pvdmc\" (UniqueName: \"kubernetes.io/projected/6853c866-17da-4ff6-99a0-2bdc83042e36-kube-api-access-pvdmc\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:10.586107 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.586102 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-service-ca\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:10.586107 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.586111 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-oauth-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:10.586329 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.586120 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-serving-cert\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:10.586329 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.586129 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/6853c866-17da-4ff6-99a0-2bdc83042e36-console-oauth-config\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:10.586329 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:10.586137 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6853c866-17da-4ff6-99a0-2bdc83042e36-trusted-ca-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:11.178250 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.178225 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84df7bcd6c-5t7v8_6853c866-17da-4ff6-99a0-2bdc83042e36/console/0.log" Apr 20 14:36:11.178624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.178263 2581 generic.go:358] "Generic (PLEG): container finished" podID="6853c866-17da-4ff6-99a0-2bdc83042e36" containerID="0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2" exitCode=2 Apr 20 14:36:11.178624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.178326 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84df7bcd6c-5t7v8" Apr 20 14:36:11.178624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.178360 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84df7bcd6c-5t7v8" event={"ID":"6853c866-17da-4ff6-99a0-2bdc83042e36","Type":"ContainerDied","Data":"0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2"} Apr 20 14:36:11.178624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.178393 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84df7bcd6c-5t7v8" event={"ID":"6853c866-17da-4ff6-99a0-2bdc83042e36","Type":"ContainerDied","Data":"bf63a60d4ef515571073de2227cf8ea75823c4bab551547000bf8def9497762b"} Apr 20 14:36:11.178624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.178412 2581 scope.go:117] "RemoveContainer" containerID="0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2" Apr 20 14:36:11.187407 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.187389 2581 scope.go:117] "RemoveContainer" containerID="0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2" Apr 20 14:36:11.187675 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:36:11.187660 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2\": container with ID starting with 0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2 not found: ID does not exist" containerID="0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2" Apr 20 14:36:11.187768 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.187681 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2"} err="failed to get container status \"0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2\": rpc error: code = 
NotFound desc = could not find container \"0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2\": container with ID starting with 0a3d954aa3b3c9554b29792d7fac2632f3955df11e900c400b54fb6e6421b1a2 not found: ID does not exist" Apr 20 14:36:11.199828 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.199805 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84df7bcd6c-5t7v8"] Apr 20 14:36:11.204200 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:11.204181 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84df7bcd6c-5t7v8"] Apr 20 14:36:12.387371 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:12.387339 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6853c866-17da-4ff6-99a0-2bdc83042e36" path="/var/lib/kubelet/pods/6853c866-17da-4ff6-99a0-2bdc83042e36/volumes" Apr 20 14:36:29.776169 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.776136 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5"] Apr 20 14:36:29.776548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.776425 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6853c866-17da-4ff6-99a0-2bdc83042e36" containerName="console" Apr 20 14:36:29.776548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.776436 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6853c866-17da-4ff6-99a0-2bdc83042e36" containerName="console" Apr 20 14:36:29.776548 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.776504 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6853c866-17da-4ff6-99a0-2bdc83042e36" containerName="console" Apr 20 14:36:29.781741 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.781707 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.784498 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.784473 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 14:36:29.785217 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.785194 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 14:36:29.785453 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.785261 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jksts\"" Apr 20 14:36:29.786841 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.786823 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5"] Apr 20 14:36:29.835180 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.835150 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2lh\" (UniqueName: \"kubernetes.io/projected/c5d3f867-bd46-458e-abe1-18fcecec62c0-kube-api-access-lg2lh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.835288 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.835184 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.835288 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.835212 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.935758 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.935715 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2lh\" (UniqueName: \"kubernetes.io/projected/c5d3f867-bd46-458e-abe1-18fcecec62c0-kube-api-access-lg2lh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.935855 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.935769 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.935855 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.935804 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.936205 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.936186 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.936242 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.936206 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:29.944029 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:29.944005 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2lh\" (UniqueName: \"kubernetes.io/projected/c5d3f867-bd46-458e-abe1-18fcecec62c0-kube-api-access-lg2lh\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:30.091781 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.091663 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:30.367957 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.367864 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655"] Apr 20 14:36:30.370648 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.370624 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.378343 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.378164 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655"] Apr 20 14:36:30.424330 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.424302 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5"] Apr 20 14:36:30.426806 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:36:30.426777 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d3f867_bd46_458e_abe1_18fcecec62c0.slice/crio-51afaf934be5e683ed706efda76f7765bbbbb977593fb827764a237c18f25415 WatchSource:0}: Error finding container 51afaf934be5e683ed706efda76f7765bbbbb977593fb827764a237c18f25415: Status 404 returned error can't find the container with id 51afaf934be5e683ed706efda76f7765bbbbb977593fb827764a237c18f25415 Apr 20 14:36:30.439658 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.439633 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.439785 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.439701 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") 
" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.439860 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.439779 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh685\" (UniqueName: \"kubernetes.io/projected/e6852ab4-9ca9-4ec0-8f18-753fa397a740-kube-api-access-bh685\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.540982 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.540959 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.541076 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.541003 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh685\" (UniqueName: \"kubernetes.io/projected/e6852ab4-9ca9-4ec0-8f18-753fa397a740-kube-api-access-bh685\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.541076 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.541061 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.541284 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.541268 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.541321 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.541310 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.548555 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.548538 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh685\" (UniqueName: \"kubernetes.io/projected/e6852ab4-9ca9-4ec0-8f18-753fa397a740-kube-api-access-bh685\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.681631 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.681584 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:30.801334 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.801300 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655"] Apr 20 14:36:30.806859 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:36:30.806826 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6852ab4_9ca9_4ec0_8f18_753fa397a740.slice/crio-7ee2b6fc4bf170658ad6cd50157e2c462a4fbdbac0f8c0d12e24d15b5b538d9e WatchSource:0}: Error finding container 7ee2b6fc4bf170658ad6cd50157e2c462a4fbdbac0f8c0d12e24d15b5b538d9e: Status 404 returned error can't find the container with id 7ee2b6fc4bf170658ad6cd50157e2c462a4fbdbac0f8c0d12e24d15b5b538d9e Apr 20 14:36:30.973480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.973453 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn"] Apr 20 14:36:30.975777 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.975763 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:30.984233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:30.984213 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn"] Apr 20 14:36:31.044823 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.044800 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.044937 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.044871 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.044937 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.044897 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zndq\" (UniqueName: \"kubernetes.io/projected/f3816e90-952c-4608-882c-5ed7a62ee98e-kube-api-access-2zndq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.145792 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.145757 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.145916 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.145847 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.145916 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.145882 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zndq\" (UniqueName: \"kubernetes.io/projected/f3816e90-952c-4608-882c-5ed7a62ee98e-kube-api-access-2zndq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.146218 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.146187 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.146294 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.146197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.154068 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.154047 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zndq\" (UniqueName: \"kubernetes.io/projected/f3816e90-952c-4608-882c-5ed7a62ee98e-kube-api-access-2zndq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.251901 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.251873 2581 generic.go:358] "Generic (PLEG): container finished" podID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerID="4e32e3fd66a4b70b00595328f0942790f5390a0dc051bfba069eb8fe867ac95c" exitCode=0 Apr 20 14:36:31.252043 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.251961 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" event={"ID":"e6852ab4-9ca9-4ec0-8f18-753fa397a740","Type":"ContainerDied","Data":"4e32e3fd66a4b70b00595328f0942790f5390a0dc051bfba069eb8fe867ac95c"} Apr 20 14:36:31.252043 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.251992 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" event={"ID":"e6852ab4-9ca9-4ec0-8f18-753fa397a740","Type":"ContainerStarted","Data":"7ee2b6fc4bf170658ad6cd50157e2c462a4fbdbac0f8c0d12e24d15b5b538d9e"} Apr 20 14:36:31.253372 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.253352 2581 generic.go:358] "Generic (PLEG): container finished" podID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerID="ca91a9159b6b6341323ecf558a75ca76f71918abb4f4486f68f71c7bede75869" exitCode=0 Apr 20 
14:36:31.253472 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.253390 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" event={"ID":"c5d3f867-bd46-458e-abe1-18fcecec62c0","Type":"ContainerDied","Data":"ca91a9159b6b6341323ecf558a75ca76f71918abb4f4486f68f71c7bede75869"} Apr 20 14:36:31.253472 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.253408 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" event={"ID":"c5d3f867-bd46-458e-abe1-18fcecec62c0","Type":"ContainerStarted","Data":"51afaf934be5e683ed706efda76f7765bbbbb977593fb827764a237c18f25415"} Apr 20 14:36:31.285784 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.285748 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:31.374832 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.374800 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr"] Apr 20 14:36:31.377919 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.377895 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.387020 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.386993 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr"] Apr 20 14:36:31.410180 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.410159 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn"] Apr 20 14:36:31.412131 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:36:31.412106 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3816e90_952c_4608_882c_5ed7a62ee98e.slice/crio-3264d7b7ff7117b3a8b2ee50db5369b6b4a8ac599ccf5e653426e93da528b165 WatchSource:0}: Error finding container 3264d7b7ff7117b3a8b2ee50db5369b6b4a8ac599ccf5e653426e93da528b165: Status 404 returned error can't find the container with id 3264d7b7ff7117b3a8b2ee50db5369b6b4a8ac599ccf5e653426e93da528b165 Apr 20 14:36:31.449107 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.449082 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.449199 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.449137 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4f6g\" (UniqueName: \"kubernetes.io/projected/b7273032-549f-4b91-a2a8-308fe2e1e6ae-kube-api-access-g4f6g\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: 
\"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.449259 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.449204 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.550226 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.550199 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.550347 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.550253 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4f6g\" (UniqueName: \"kubernetes.io/projected/b7273032-549f-4b91-a2a8-308fe2e1e6ae-kube-api-access-g4f6g\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.550347 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.550323 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.550636 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.550617 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.550675 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.550625 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.558788 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.558764 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4f6g\" (UniqueName: \"kubernetes.io/projected/b7273032-549f-4b91-a2a8-308fe2e1e6ae-kube-api-access-g4f6g\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.691105 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.691075 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:31.815365 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:31.815339 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr"] Apr 20 14:36:31.817173 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:36:31.817141 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7273032_549f_4b91_a2a8_308fe2e1e6ae.slice/crio-8c691de165dbc738af9f7a70afe9da4f1b52ff898511e95a1cd4a2f3b7a04954 WatchSource:0}: Error finding container 8c691de165dbc738af9f7a70afe9da4f1b52ff898511e95a1cd4a2f3b7a04954: Status 404 returned error can't find the container with id 8c691de165dbc738af9f7a70afe9da4f1b52ff898511e95a1cd4a2f3b7a04954 Apr 20 14:36:32.259572 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:32.259548 2581 generic.go:358] "Generic (PLEG): container finished" podID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerID="bd4d03f24658ddda941c723cf3d569a6327c5625c8038a0821795dd8c475115a" exitCode=0 Apr 20 14:36:32.259670 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:32.259628 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" event={"ID":"f3816e90-952c-4608-882c-5ed7a62ee98e","Type":"ContainerDied","Data":"bd4d03f24658ddda941c723cf3d569a6327c5625c8038a0821795dd8c475115a"} Apr 20 14:36:32.259670 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:32.259661 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" event={"ID":"f3816e90-952c-4608-882c-5ed7a62ee98e","Type":"ContainerStarted","Data":"3264d7b7ff7117b3a8b2ee50db5369b6b4a8ac599ccf5e653426e93da528b165"} Apr 20 14:36:32.261253 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:32.261185 
2581 generic.go:358] "Generic (PLEG): container finished" podID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerID="77bc50d374c1044b53da27203c72498314b9e8dfd421a6dbf606ec9aad50663a" exitCode=0 Apr 20 14:36:32.261253 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:32.261217 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" event={"ID":"b7273032-549f-4b91-a2a8-308fe2e1e6ae","Type":"ContainerDied","Data":"77bc50d374c1044b53da27203c72498314b9e8dfd421a6dbf606ec9aad50663a"} Apr 20 14:36:32.261253 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:32.261248 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" event={"ID":"b7273032-549f-4b91-a2a8-308fe2e1e6ae","Type":"ContainerStarted","Data":"8c691de165dbc738af9f7a70afe9da4f1b52ff898511e95a1cd4a2f3b7a04954"} Apr 20 14:36:33.266316 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.266291 2581 generic.go:358] "Generic (PLEG): container finished" podID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerID="3abdf159d97672fc82c506824cbc75af83b9b4a50fb598e3a14ca8c40ecc9c1c" exitCode=0 Apr 20 14:36:33.266656 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.266379 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" event={"ID":"e6852ab4-9ca9-4ec0-8f18-753fa397a740","Type":"ContainerDied","Data":"3abdf159d97672fc82c506824cbc75af83b9b4a50fb598e3a14ca8c40ecc9c1c"} Apr 20 14:36:33.268172 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.268147 2581 generic.go:358] "Generic (PLEG): container finished" podID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerID="7eedecd3d0b1cedf8b72788b7de91d111ab93a29947482780181a25a38f06d93" exitCode=0 Apr 20 14:36:33.268262 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.268225 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" event={"ID":"b7273032-549f-4b91-a2a8-308fe2e1e6ae","Type":"ContainerDied","Data":"7eedecd3d0b1cedf8b72788b7de91d111ab93a29947482780181a25a38f06d93"} Apr 20 14:36:33.270026 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.270004 2581 generic.go:358] "Generic (PLEG): container finished" podID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerID="07df3fd4a05c0ff1f29a9e396eed6b7da112685a599d68464706edf397d71018" exitCode=0 Apr 20 14:36:33.270125 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.270065 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" event={"ID":"c5d3f867-bd46-458e-abe1-18fcecec62c0","Type":"ContainerDied","Data":"07df3fd4a05c0ff1f29a9e396eed6b7da112685a599d68464706edf397d71018"} Apr 20 14:36:33.272115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.272011 2581 generic.go:358] "Generic (PLEG): container finished" podID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerID="c205f12524faece30c975979429deceb4d4f73187a1c944046dc2dd138011815" exitCode=0 Apr 20 14:36:33.272115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:33.272041 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" event={"ID":"f3816e90-952c-4608-882c-5ed7a62ee98e","Type":"ContainerDied","Data":"c205f12524faece30c975979429deceb4d4f73187a1c944046dc2dd138011815"} Apr 20 14:36:34.284238 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.284152 2581 generic.go:358] "Generic (PLEG): container finished" podID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerID="4948918a2cd02ffaefa83bd277f9e33913073b7e97920d8b8120d76c55e70ae2" exitCode=0 Apr 20 14:36:34.284238 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.284226 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" event={"ID":"c5d3f867-bd46-458e-abe1-18fcecec62c0","Type":"ContainerDied","Data":"4948918a2cd02ffaefa83bd277f9e33913073b7e97920d8b8120d76c55e70ae2"} Apr 20 14:36:34.286062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.286037 2581 generic.go:358] "Generic (PLEG): container finished" podID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerID="cec9a7d05058cfdf0e3f9485f1aa2635e89fb08f1839d5cb2e02e73ff7fe6853" exitCode=0 Apr 20 14:36:34.286150 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.286116 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" event={"ID":"f3816e90-952c-4608-882c-5ed7a62ee98e","Type":"ContainerDied","Data":"cec9a7d05058cfdf0e3f9485f1aa2635e89fb08f1839d5cb2e02e73ff7fe6853"} Apr 20 14:36:34.287956 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.287935 2581 generic.go:358] "Generic (PLEG): container finished" podID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerID="9d6e73ea3b6af2da838853cc5157be8a756ac4ff918619c386347b406de740f2" exitCode=0 Apr 20 14:36:34.288051 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.288020 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" event={"ID":"e6852ab4-9ca9-4ec0-8f18-753fa397a740","Type":"ContainerDied","Data":"9d6e73ea3b6af2da838853cc5157be8a756ac4ff918619c386347b406de740f2"} Apr 20 14:36:34.289718 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.289697 2581 generic.go:358] "Generic (PLEG): container finished" podID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerID="d4e873aa45e908661ca02f36fc7ebe165f26ebe30a09f60908c0f050f3c1e24e" exitCode=0 Apr 20 14:36:34.289865 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:34.289752 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" event={"ID":"b7273032-549f-4b91-a2a8-308fe2e1e6ae","Type":"ContainerDied","Data":"d4e873aa45e908661ca02f36fc7ebe165f26ebe30a09f60908c0f050f3c1e24e"} Apr 20 14:36:35.427223 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.427200 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:35.483481 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.483457 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:35.487402 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.487384 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:35.487642 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.487631 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-bundle\") pod \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " Apr 20 14:36:35.487687 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.487652 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-util\") pod \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " Apr 20 14:36:35.487763 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.487744 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh685\" (UniqueName: \"kubernetes.io/projected/e6852ab4-9ca9-4ec0-8f18-753fa397a740-kube-api-access-bh685\") pod 
\"e6852ab4-9ca9-4ec0-8f18-753fa397a740\" (UID: \"e6852ab4-9ca9-4ec0-8f18-753fa397a740\") " Apr 20 14:36:35.488142 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.488120 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-bundle" (OuterVolumeSpecName: "bundle") pod "e6852ab4-9ca9-4ec0-8f18-753fa397a740" (UID: "e6852ab4-9ca9-4ec0-8f18-753fa397a740"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.490006 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.489984 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6852ab4-9ca9-4ec0-8f18-753fa397a740-kube-api-access-bh685" (OuterVolumeSpecName: "kube-api-access-bh685") pod "e6852ab4-9ca9-4ec0-8f18-753fa397a740" (UID: "e6852ab4-9ca9-4ec0-8f18-753fa397a740"). InnerVolumeSpecName "kube-api-access-bh685". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:36:35.491562 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.491548 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:35.493362 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.493338 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-util" (OuterVolumeSpecName: "util") pod "e6852ab4-9ca9-4ec0-8f18-753fa397a740" (UID: "e6852ab4-9ca9-4ec0-8f18-753fa397a740"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.589056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.588984 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-util\") pod \"c5d3f867-bd46-458e-abe1-18fcecec62c0\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " Apr 20 14:36:35.589056 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589026 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-util\") pod \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " Apr 20 14:36:35.589246 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589076 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4f6g\" (UniqueName: \"kubernetes.io/projected/b7273032-549f-4b91-a2a8-308fe2e1e6ae-kube-api-access-g4f6g\") pod \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " Apr 20 14:36:35.589246 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589102 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-bundle\") pod \"c5d3f867-bd46-458e-abe1-18fcecec62c0\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " Apr 20 14:36:35.589246 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589142 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zndq\" (UniqueName: \"kubernetes.io/projected/f3816e90-952c-4608-882c-5ed7a62ee98e-kube-api-access-2zndq\") pod \"f3816e90-952c-4608-882c-5ed7a62ee98e\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " Apr 20 14:36:35.589246 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589171 2581 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-bundle\") pod \"f3816e90-952c-4608-882c-5ed7a62ee98e\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " Apr 20 14:36:35.589246 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589226 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-bundle\") pod \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\" (UID: \"b7273032-549f-4b91-a2a8-308fe2e1e6ae\") " Apr 20 14:36:35.589483 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589259 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-util\") pod \"f3816e90-952c-4608-882c-5ed7a62ee98e\" (UID: \"f3816e90-952c-4608-882c-5ed7a62ee98e\") " Apr 20 14:36:35.589483 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589291 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg2lh\" (UniqueName: \"kubernetes.io/projected/c5d3f867-bd46-458e-abe1-18fcecec62c0-kube-api-access-lg2lh\") pod \"c5d3f867-bd46-458e-abe1-18fcecec62c0\" (UID: \"c5d3f867-bd46-458e-abe1-18fcecec62c0\") " Apr 20 14:36:35.589585 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589541 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.589585 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589556 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e6852ab4-9ca9-4ec0-8f18-753fa397a740-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.589585 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:36:35.589571 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bh685\" (UniqueName: \"kubernetes.io/projected/e6852ab4-9ca9-4ec0-8f18-753fa397a740-kube-api-access-bh685\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.589949 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589922 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-bundle" (OuterVolumeSpecName: "bundle") pod "c5d3f867-bd46-458e-abe1-18fcecec62c0" (UID: "c5d3f867-bd46-458e-abe1-18fcecec62c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.590032 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.589916 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-bundle" (OuterVolumeSpecName: "bundle") pod "b7273032-549f-4b91-a2a8-308fe2e1e6ae" (UID: "b7273032-549f-4b91-a2a8-308fe2e1e6ae"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.590464 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.590417 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-bundle" (OuterVolumeSpecName: "bundle") pod "f3816e90-952c-4608-882c-5ed7a62ee98e" (UID: "f3816e90-952c-4608-882c-5ed7a62ee98e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.591702 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.591678 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3816e90-952c-4608-882c-5ed7a62ee98e-kube-api-access-2zndq" (OuterVolumeSpecName: "kube-api-access-2zndq") pod "f3816e90-952c-4608-882c-5ed7a62ee98e" (UID: "f3816e90-952c-4608-882c-5ed7a62ee98e"). 
InnerVolumeSpecName "kube-api-access-2zndq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:36:35.592006 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.591987 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7273032-549f-4b91-a2a8-308fe2e1e6ae-kube-api-access-g4f6g" (OuterVolumeSpecName: "kube-api-access-g4f6g") pod "b7273032-549f-4b91-a2a8-308fe2e1e6ae" (UID: "b7273032-549f-4b91-a2a8-308fe2e1e6ae"). InnerVolumeSpecName "kube-api-access-g4f6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:36:35.592138 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.592115 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d3f867-bd46-458e-abe1-18fcecec62c0-kube-api-access-lg2lh" (OuterVolumeSpecName: "kube-api-access-lg2lh") pod "c5d3f867-bd46-458e-abe1-18fcecec62c0" (UID: "c5d3f867-bd46-458e-abe1-18fcecec62c0"). InnerVolumeSpecName "kube-api-access-lg2lh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:36:35.595294 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.595262 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-util" (OuterVolumeSpecName: "util") pod "b7273032-549f-4b91-a2a8-308fe2e1e6ae" (UID: "b7273032-549f-4b91-a2a8-308fe2e1e6ae"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.595915 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.595895 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-util" (OuterVolumeSpecName: "util") pod "c5d3f867-bd46-458e-abe1-18fcecec62c0" (UID: "c5d3f867-bd46-458e-abe1-18fcecec62c0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.596345 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.596324 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-util" (OuterVolumeSpecName: "util") pod "f3816e90-952c-4608-882c-5ed7a62ee98e" (UID: "f3816e90-952c-4608-882c-5ed7a62ee98e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:36:35.690515 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690489 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4f6g\" (UniqueName: \"kubernetes.io/projected/b7273032-549f-4b91-a2a8-308fe2e1e6ae-kube-api-access-g4f6g\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690515 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690511 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690521 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zndq\" (UniqueName: \"kubernetes.io/projected/f3816e90-952c-4608-882c-5ed7a62ee98e-kube-api-access-2zndq\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690530 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-bundle\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690539 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-bundle\") on node 
\"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690556 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3816e90-952c-4608-882c-5ed7a62ee98e-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690565 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lg2lh\" (UniqueName: \"kubernetes.io/projected/c5d3f867-bd46-458e-abe1-18fcecec62c0-kube-api-access-lg2lh\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690573 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5d3f867-bd46-458e-abe1-18fcecec62c0-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:35.690644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:35.690581 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7273032-549f-4b91-a2a8-308fe2e1e6ae-util\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:36:36.299104 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.299073 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" event={"ID":"c5d3f867-bd46-458e-abe1-18fcecec62c0","Type":"ContainerDied","Data":"51afaf934be5e683ed706efda76f7765bbbbb977593fb827764a237c18f25415"} Apr 20 14:36:36.299104 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.299097 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5" Apr 20 14:36:36.299337 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.299104 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51afaf934be5e683ed706efda76f7765bbbbb977593fb827764a237c18f25415" Apr 20 14:36:36.300960 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.300938 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" event={"ID":"f3816e90-952c-4608-882c-5ed7a62ee98e","Type":"ContainerDied","Data":"3264d7b7ff7117b3a8b2ee50db5369b6b4a8ac599ccf5e653426e93da528b165"} Apr 20 14:36:36.301081 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.300963 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3264d7b7ff7117b3a8b2ee50db5369b6b4a8ac599ccf5e653426e93da528b165" Apr 20 14:36:36.301081 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.300949 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn" Apr 20 14:36:36.302780 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.302755 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" event={"ID":"e6852ab4-9ca9-4ec0-8f18-753fa397a740","Type":"ContainerDied","Data":"7ee2b6fc4bf170658ad6cd50157e2c462a4fbdbac0f8c0d12e24d15b5b538d9e"} Apr 20 14:36:36.302876 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.302786 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee2b6fc4bf170658ad6cd50157e2c462a4fbdbac0f8c0d12e24d15b5b538d9e" Apr 20 14:36:36.302876 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.302760 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655" Apr 20 14:36:36.304574 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.304554 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" event={"ID":"b7273032-549f-4b91-a2a8-308fe2e1e6ae","Type":"ContainerDied","Data":"8c691de165dbc738af9f7a70afe9da4f1b52ff898511e95a1cd4a2f3b7a04954"} Apr 20 14:36:36.304657 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.304579 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c691de165dbc738af9f7a70afe9da4f1b52ff898511e95a1cd4a2f3b7a04954" Apr 20 14:36:36.304657 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:36.304606 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr" Apr 20 14:36:48.909185 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909149 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"] Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909465 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909479 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909489 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909494 2581 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909500 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909506 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909514 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909519 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909526 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909532 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909540 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909548 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909558 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" 
containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909562 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909568 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909572 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909578 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909582 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="util" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909588 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909593 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909605 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909609 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerName="pull" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: 
I0420 14:36:48.909615 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerName="extract" Apr 20 14:36:48.909616 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909622 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerName="extract" Apr 20 14:36:48.910335 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909687 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3816e90-952c-4608-882c-5ed7a62ee98e" containerName="extract" Apr 20 14:36:48.910335 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909699 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6852ab4-9ca9-4ec0-8f18-753fa397a740" containerName="extract" Apr 20 14:36:48.910335 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909711 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7273032-549f-4b91-a2a8-308fe2e1e6ae" containerName="extract" Apr 20 14:36:48.910335 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.909743 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5d3f867-bd46-458e-abe1-18fcecec62c0" containerName="extract" Apr 20 14:36:48.912714 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.912693 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:48.915553 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.915522 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 14:36:48.915675 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.915593 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 14:36:48.915675 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.915609 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-dx4gf\"" Apr 20 14:36:48.927031 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.927005 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"] Apr 20 14:36:48.993574 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.993544 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxw57\" (UniqueName: \"kubernetes.io/projected/6f8fc969-659d-4443-94b7-ef15298b76a7-kube-api-access-dxw57\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:48.993710 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:48.993666 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6f8fc969-659d-4443-94b7-ef15298b76a7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:49.094505 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:49.094483 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6f8fc969-659d-4443-94b7-ef15298b76a7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:49.094633 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:49.094545 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxw57\" (UniqueName: \"kubernetes.io/projected/6f8fc969-659d-4443-94b7-ef15298b76a7-kube-api-access-dxw57\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:49.094989 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:49.094969 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6f8fc969-659d-4443-94b7-ef15298b76a7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:49.102996 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:49.102976 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxw57\" (UniqueName: \"kubernetes.io/projected/6f8fc969-659d-4443-94b7-ef15298b76a7-kube-api-access-dxw57\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:49.223238 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:49.223210 2581 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:49.348606 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:49.348575 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"] Apr 20 14:36:49.350336 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:36:49.350309 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f8fc969_659d_4443_94b7_ef15298b76a7.slice/crio-f5039ce756bc0ea07f4a6852eb8c2be8e746cbd7a46b376ca7579efae61c9bbe WatchSource:0}: Error finding container f5039ce756bc0ea07f4a6852eb8c2be8e746cbd7a46b376ca7579efae61c9bbe: Status 404 returned error can't find the container with id f5039ce756bc0ea07f4a6852eb8c2be8e746cbd7a46b376ca7579efae61c9bbe Apr 20 14:36:50.358610 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:50.358573 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" event={"ID":"6f8fc969-659d-4443-94b7-ef15298b76a7","Type":"ContainerStarted","Data":"f5039ce756bc0ea07f4a6852eb8c2be8e746cbd7a46b376ca7579efae61c9bbe"} Apr 20 14:36:56.386487 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:56.386460 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:36:56.386835 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:56.386492 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" event={"ID":"6f8fc969-659d-4443-94b7-ef15298b76a7","Type":"ContainerStarted","Data":"85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011"} Apr 20 14:36:56.404524 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:36:56.404479 2581 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" podStartSLOduration=1.51739746 podStartE2EDuration="8.404467297s" podCreationTimestamp="2026-04-20 14:36:48 +0000 UTC" firstStartedPulling="2026-04-20 14:36:49.352640526 +0000 UTC m=+589.532128983" lastFinishedPulling="2026-04-20 14:36:56.239710362 +0000 UTC m=+596.419198820" observedRunningTime="2026-04-20 14:36:56.401119641 +0000 UTC m=+596.580608109" watchObservedRunningTime="2026-04-20 14:36:56.404467297 +0000 UTC m=+596.583955765" Apr 20 14:37:00.288428 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:00.288399 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:37:00.288912 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:00.288461 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:37:00.299751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:00.299714 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:37:00.299859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:00.299754 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:37:07.389920 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:07.389833 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" Apr 20 14:37:08.519444 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.519405 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"]
Apr 20 14:37:08.522585 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.522569 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:08.532171 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.532144 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"]
Apr 20 14:37:08.621300 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.621276 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"]
Apr 20 14:37:08.621470 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:37:08.621453 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-sv476], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53"
Apr 20 14:37:08.650134 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.650107 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2444eb35-d4b7-4324-b30f-ac6a43c96d53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:08.650244 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.650153 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv476\" (UniqueName: \"kubernetes.io/projected/2444eb35-d4b7-4324-b30f-ac6a43c96d53-kube-api-access-sv476\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:08.751192 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.751153 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2444eb35-d4b7-4324-b30f-ac6a43c96d53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:08.751316 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.751214 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv476\" (UniqueName: \"kubernetes.io/projected/2444eb35-d4b7-4324-b30f-ac6a43c96d53-kube-api-access-sv476\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:08.751523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.751501 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2444eb35-d4b7-4324-b30f-ac6a43c96d53-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:08.759779 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:08.759756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv476\" (UniqueName: \"kubernetes.io/projected/2444eb35-d4b7-4324-b30f-ac6a43c96d53-kube-api-access-sv476\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:09.179112 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.179083 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"]
Apr 20 14:37:09.179420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.179371 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" containerName="manager" containerID="cri-o://85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011" gracePeriod=2
Apr 20 14:37:09.196326 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.196295 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"]
Apr 20 14:37:09.212465 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.212442 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"]
Apr 20 14:37:09.212837 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.212814 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" containerName="manager"
Apr 20 14:37:09.212837 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.212837 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" containerName="manager"
Apr 20 14:37:09.213021 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.212932 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" containerName="manager"
Apr 20 14:37:09.216096 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.216076 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"]
Apr 20 14:37:09.216202 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.216180 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.218657 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.218624 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.231493 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.231471 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"]
Apr 20 14:37:09.234937 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.234892 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"]
Apr 20 14:37:09.292221 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.292196 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"]
Apr 20 14:37:09.295420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.295403 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.315533 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.315502 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"]
Apr 20 14:37:09.355079 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.355047 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.355369 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.355351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglm9\" (UniqueName: \"kubernetes.io/projected/ab279091-c87c-4718-8f39-b1de243fb031-kube-api-access-xglm9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wxkz\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.355437 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.355387 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab279091-c87c-4718-8f39-b1de243fb031-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wxkz\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.407751 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.407712 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"
Apr 20 14:37:09.410071 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.410047 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.428468 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.428446 2581 generic.go:358] "Generic (PLEG): container finished" podID="6f8fc969-659d-4443-94b7-ef15298b76a7" containerID="85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011" exitCode=0
Apr 20 14:37:09.428558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.428487 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r"
Apr 20 14:37:09.428558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.428532 2581 scope.go:117] "RemoveContainer" containerID="85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011"
Apr 20 14:37:09.428651 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.428589 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:09.431379 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.431358 2581 status_manager.go:895] "Failed to get status for pod" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.432885 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.432867 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:09.433416 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.433392 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.435278 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.435260 2581 status_manager.go:895] "Failed to get status for pod" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.437123 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.437110 2581 scope.go:117] "RemoveContainer" containerID="85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011"
Apr 20 14:37:09.437205 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.437180 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.437375 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:37:09.437355 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011\": container with ID starting with 85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011 not found: ID does not exist" containerID="85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011"
Apr 20 14:37:09.437420 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.437382 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011"} err="failed to get container status \"85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011\": rpc error: code = NotFound desc = could not find container \"85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011\": container with ID starting with 85d5e12402402f37aee7bf8069804e304b11b34da9a04d81277f540e060a5011 not found: ID does not exist"
Apr 20 14:37:09.456812 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.456789 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xglm9\" (UniqueName: \"kubernetes.io/projected/ab279091-c87c-4718-8f39-b1de243fb031-kube-api-access-xglm9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wxkz\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.456879 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.456834 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab279091-c87c-4718-8f39-b1de243fb031-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wxkz\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.456928 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.456882 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-phn92\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.456990 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.456968 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8k5w\" (UniqueName: \"kubernetes.io/projected/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-kube-api-access-z8k5w\") pod \"kuadrant-operator-controller-manager-84b657d985-phn92\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.457213 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.457197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab279091-c87c-4718-8f39-b1de243fb031-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wxkz\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.471233 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.471209 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglm9\" (UniqueName: \"kubernetes.io/projected/ab279091-c87c-4718-8f39-b1de243fb031-kube-api-access-xglm9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7wxkz\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.558014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.557992 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2444eb35-d4b7-4324-b30f-ac6a43c96d53-extensions-socket-volume\") pod \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") "
Apr 20 14:37:09.558358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558027 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6f8fc969-659d-4443-94b7-ef15298b76a7-extensions-socket-volume\") pod \"6f8fc969-659d-4443-94b7-ef15298b76a7\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") "
Apr 20 14:37:09.558358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558053 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv476\" (UniqueName: \"kubernetes.io/projected/2444eb35-d4b7-4324-b30f-ac6a43c96d53-kube-api-access-sv476\") pod \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\" (UID: \"2444eb35-d4b7-4324-b30f-ac6a43c96d53\") "
Apr 20 14:37:09.558358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558081 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxw57\" (UniqueName: \"kubernetes.io/projected/6f8fc969-659d-4443-94b7-ef15298b76a7-kube-api-access-dxw57\") pod \"6f8fc969-659d-4443-94b7-ef15298b76a7\" (UID: \"6f8fc969-659d-4443-94b7-ef15298b76a7\") "
Apr 20 14:37:09.558358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558230 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2444eb35-d4b7-4324-b30f-ac6a43c96d53-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2444eb35-d4b7-4324-b30f-ac6a43c96d53" (UID: "2444eb35-d4b7-4324-b30f-ac6a43c96d53"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:37:09.558358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-phn92\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.558358 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558327 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8k5w\" (UniqueName: \"kubernetes.io/projected/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-kube-api-access-z8k5w\") pod \"kuadrant-operator-controller-manager-84b657d985-phn92\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.558580 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558366 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f8fc969-659d-4443-94b7-ef15298b76a7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6f8fc969-659d-4443-94b7-ef15298b76a7" (UID: "6f8fc969-659d-4443-94b7-ef15298b76a7"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:37:09.558580 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558478 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2444eb35-d4b7-4324-b30f-ac6a43c96d53-extensions-socket-volume\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:37:09.558580 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558500 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6f8fc969-659d-4443-94b7-ef15298b76a7-extensions-socket-volume\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:37:09.558685 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.558613 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-phn92\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.560088 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.560070 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8fc969-659d-4443-94b7-ef15298b76a7-kube-api-access-dxw57" (OuterVolumeSpecName: "kube-api-access-dxw57") pod "6f8fc969-659d-4443-94b7-ef15298b76a7" (UID: "6f8fc969-659d-4443-94b7-ef15298b76a7"). InnerVolumeSpecName "kube-api-access-dxw57". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:37:09.560232 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.560211 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2444eb35-d4b7-4324-b30f-ac6a43c96d53-kube-api-access-sv476" (OuterVolumeSpecName: "kube-api-access-sv476") pod "2444eb35-d4b7-4324-b30f-ac6a43c96d53" (UID: "2444eb35-d4b7-4324-b30f-ac6a43c96d53"). InnerVolumeSpecName "kube-api-access-sv476". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:37:09.567615 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.567588 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8k5w\" (UniqueName: \"kubernetes.io/projected/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-kube-api-access-z8k5w\") pod \"kuadrant-operator-controller-manager-84b657d985-phn92\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.568363 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.568349 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:09.605678 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.605638 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:09.659874 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.659830 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sv476\" (UniqueName: \"kubernetes.io/projected/2444eb35-d4b7-4324-b30f-ac6a43c96d53-kube-api-access-sv476\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:37:09.659874 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.659854 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxw57\" (UniqueName: \"kubernetes.io/projected/6f8fc969-659d-4443-94b7-ef15298b76a7-kube-api-access-dxw57\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\""
Apr 20 14:37:09.713683 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.713642 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"]
Apr 20 14:37:09.714589 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:37:09.714556 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab279091_c87c_4718_8f39_b1de243fb031.slice/crio-47af4701a8517519db7fc69f411e1805ad37e6b02b150b44364b8d207c62d29b WatchSource:0}: Error finding container 47af4701a8517519db7fc69f411e1805ad37e6b02b150b44364b8d207c62d29b: Status 404 returned error can't find the container with id 47af4701a8517519db7fc69f411e1805ad37e6b02b150b44364b8d207c62d29b
Apr 20 14:37:09.743047 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.743020 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"]
Apr 20 14:37:09.747965 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:37:09.747815 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6601a57c_c8e3_47bd_8a7c_e8c6deceae2a.slice/crio-0c276ebcc4d8d66833f22e74247e161f3d90bc1fd64ae71bd6c9205f43b4ebaa WatchSource:0}: Error finding container 0c276ebcc4d8d66833f22e74247e161f3d90bc1fd64ae71bd6c9205f43b4ebaa: Status 404 returned error can't find the container with id 0c276ebcc4d8d66833f22e74247e161f3d90bc1fd64ae71bd6c9205f43b4ebaa
Apr 20 14:37:09.750937 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.750905 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:09.752810 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:09.752790 2581 status_manager.go:895] "Failed to get status for pod" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:10.387306 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.387275 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53" path="/var/lib/kubelet/pods/2444eb35-d4b7-4324-b30f-ac6a43c96d53/volumes"
Apr 20 14:37:10.387520 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.387506 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" path="/var/lib/kubelet/pods/6f8fc969-659d-4443-94b7-ef15298b76a7/volumes"
Apr 20 14:37:10.387879 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.387856 2581 status_manager.go:895] "Failed to get status for pod" podUID="6f8fc969-659d-4443-94b7-ef15298b76a7" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txw6r" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-txw6r\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:10.389565 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.389544 2581 status_manager.go:895] "Failed to get status for pod" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:10.434539 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.434512 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" event={"ID":"ab279091-c87c-4718-8f39-b1de243fb031","Type":"ContainerStarted","Data":"72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661"}
Apr 20 14:37:10.434539 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.434543 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" event={"ID":"ab279091-c87c-4718-8f39-b1de243fb031","Type":"ContainerStarted","Data":"47af4701a8517519db7fc69f411e1805ad37e6b02b150b44364b8d207c62d29b"}
Apr 20 14:37:10.434810 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.434571 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:10.436162 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.436140 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl"
Apr 20 14:37:10.436162 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.436152 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" event={"ID":"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a","Type":"ContainerStarted","Data":"4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f"}
Apr 20 14:37:10.436316 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.436175 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" event={"ID":"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a","Type":"ContainerStarted","Data":"0c276ebcc4d8d66833f22e74247e161f3d90bc1fd64ae71bd6c9205f43b4ebaa"}
Apr 20 14:37:10.436316 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.436198 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:10.457937 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.455609 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" podStartSLOduration=1.4555936329999999 podStartE2EDuration="1.455593633s" podCreationTimestamp="2026-04-20 14:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:37:10.4524324 +0000 UTC m=+610.631920869" watchObservedRunningTime="2026-04-20 14:37:10.455593633 +0000 UTC m=+610.635082104"
Apr 20 14:37:10.472830 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.472772 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" podStartSLOduration=1.472753328 podStartE2EDuration="1.472753328s" podCreationTimestamp="2026-04-20 14:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:37:10.472235126 +0000 UTC m=+610.651723595" watchObservedRunningTime="2026-04-20 14:37:10.472753328 +0000 UTC m=+610.652241793"
Apr 20 14:37:10.474036 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:10.473993 2581 status_manager.go:895] "Failed to get status for pod" podUID="2444eb35-d4b7-4324-b30f-ac6a43c96d53" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-bwmsl\" is forbidden: User \"system:node:ip-10-0-139-136.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-139-136.ec2.internal' and this object"
Apr 20 14:37:21.441504 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.441475 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"
Apr 20 14:37:21.442028 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.441526 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:21.513694 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.513656 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"]
Apr 20 14:37:21.513928 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.513897 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" podUID="ab279091-c87c-4718-8f39-b1de243fb031" containerName="manager" containerID="cri-o://72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661" gracePeriod=10
Apr 20 14:37:21.764082 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.764053 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"]
Apr 20 14:37:21.765409 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.765391 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"
Apr 20 14:37:21.767467 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.767449 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:37:21.778480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.778460 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"]
Apr 20 14:37:21.859414 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.859378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3250d467-6a26-4a86-bca3-2c39b4dd7245-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4hjks\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:37:21.859592 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.859433 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ts47\" (UniqueName: \"kubernetes.io/projected/3250d467-6a26-4a86-bca3-2c39b4dd7245-kube-api-access-5ts47\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4hjks\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:37:21.960685 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.960650 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglm9\" (UniqueName: \"kubernetes.io/projected/ab279091-c87c-4718-8f39-b1de243fb031-kube-api-access-xglm9\") pod \"ab279091-c87c-4718-8f39-b1de243fb031\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") "
Apr 20 14:37:21.960945 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.960761 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab279091-c87c-4718-8f39-b1de243fb031-extensions-socket-volume\") pod \"ab279091-c87c-4718-8f39-b1de243fb031\" (UID: \"ab279091-c87c-4718-8f39-b1de243fb031\") "
Apr 20 14:37:21.960945 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.960841 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3250d467-6a26-4a86-bca3-2c39b4dd7245-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4hjks\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:37:21.960945 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.960874 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ts47\" (UniqueName: \"kubernetes.io/projected/3250d467-6a26-4a86-bca3-2c39b4dd7245-kube-api-access-5ts47\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4hjks\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:37:21.961202 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.961176 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab279091-c87c-4718-8f39-b1de243fb031-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ab279091-c87c-4718-8f39-b1de243fb031" (UID: "ab279091-c87c-4718-8f39-b1de243fb031"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:37:21.961258 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.961245 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3250d467-6a26-4a86-bca3-2c39b4dd7245-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4hjks\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:37:21.962894 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.962873 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab279091-c87c-4718-8f39-b1de243fb031-kube-api-access-xglm9" (OuterVolumeSpecName: "kube-api-access-xglm9") pod "ab279091-c87c-4718-8f39-b1de243fb031" (UID: "ab279091-c87c-4718-8f39-b1de243fb031"). InnerVolumeSpecName "kube-api-access-xglm9".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:37:21.969494 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:21.969464 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ts47\" (UniqueName: \"kubernetes.io/projected/3250d467-6a26-4a86-bca3-2c39b4dd7245-kube-api-access-5ts47\") pod \"kuadrant-operator-controller-manager-55c7f4c975-4hjks\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" Apr 20 14:37:22.061372 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.061306 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab279091-c87c-4718-8f39-b1de243fb031-extensions-socket-volume\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:37:22.061372 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.061332 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xglm9\" (UniqueName: \"kubernetes.io/projected/ab279091-c87c-4718-8f39-b1de243fb031-kube-api-access-xglm9\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:37:22.077661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.077631 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" Apr 20 14:37:22.411110 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.411087 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"] Apr 20 14:37:22.413424 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:37:22.413400 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3250d467_6a26_4a86_bca3_2c39b4dd7245.slice/crio-39f192415eb0096e71400aa8622cd1fd393e9eb19c8cab5c4ee0e3130494bc69 WatchSource:0}: Error finding container 39f192415eb0096e71400aa8622cd1fd393e9eb19c8cab5c4ee0e3130494bc69: Status 404 returned error can't find the container with id 39f192415eb0096e71400aa8622cd1fd393e9eb19c8cab5c4ee0e3130494bc69 Apr 20 14:37:22.415492 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.415477 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:37:22.481075 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.481040 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" event={"ID":"3250d467-6a26-4a86-bca3-2c39b4dd7245","Type":"ContainerStarted","Data":"39f192415eb0096e71400aa8622cd1fd393e9eb19c8cab5c4ee0e3130494bc69"} Apr 20 14:37:22.482248 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.482222 2581 generic.go:358] "Generic (PLEG): container finished" podID="ab279091-c87c-4718-8f39-b1de243fb031" containerID="72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661" exitCode=0 Apr 20 14:37:22.482346 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.482296 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" Apr 20 14:37:22.482388 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.482296 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" event={"ID":"ab279091-c87c-4718-8f39-b1de243fb031","Type":"ContainerDied","Data":"72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661"} Apr 20 14:37:22.482423 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.482398 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz" event={"ID":"ab279091-c87c-4718-8f39-b1de243fb031","Type":"ContainerDied","Data":"47af4701a8517519db7fc69f411e1805ad37e6b02b150b44364b8d207c62d29b"} Apr 20 14:37:22.482423 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.482415 2581 scope.go:117] "RemoveContainer" containerID="72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661" Apr 20 14:37:22.490550 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.490530 2581 scope.go:117] "RemoveContainer" containerID="72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661" Apr 20 14:37:22.490799 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:37:22.490781 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661\": container with ID starting with 72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661 not found: ID does not exist" containerID="72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661" Apr 20 14:37:22.490876 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.490806 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661"} err="failed to get container status 
\"72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661\": rpc error: code = NotFound desc = could not find container \"72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661\": container with ID starting with 72c0ab822fdaeb8d466b308a3918da70d2fb0b65cb18b7f38a959de77e6c8661 not found: ID does not exist" Apr 20 14:37:22.498880 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.498861 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"] Apr 20 14:37:22.506693 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:22.506667 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7wxkz"] Apr 20 14:37:23.487537 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:23.487501 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" event={"ID":"3250d467-6a26-4a86-bca3-2c39b4dd7245","Type":"ContainerStarted","Data":"3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add"} Apr 20 14:37:23.487963 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:23.487592 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" Apr 20 14:37:23.509131 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:23.509092 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" podStartSLOduration=2.509080941 podStartE2EDuration="2.509080941s" podCreationTimestamp="2026-04-20 14:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:37:23.505914889 +0000 UTC m=+623.685403356" watchObservedRunningTime="2026-04-20 14:37:23.509080941 +0000 UTC m=+623.688569409" Apr 20 14:37:24.388504 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:24.388474 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab279091-c87c-4718-8f39-b1de243fb031" path="/var/lib/kubelet/pods/ab279091-c87c-4718-8f39-b1de243fb031/volumes" Apr 20 14:37:34.494523 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.494496 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" Apr 20 14:37:34.545622 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.545588 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"] Apr 20 14:37:34.545909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.545885 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" podUID="6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" containerName="manager" containerID="cri-o://4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f" gracePeriod=10 Apr 20 14:37:34.787245 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.787224 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" Apr 20 14:37:34.861706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.861683 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-extensions-socket-volume\") pod \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " Apr 20 14:37:34.861848 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.861752 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8k5w\" (UniqueName: \"kubernetes.io/projected/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-kube-api-access-z8k5w\") pod \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\" (UID: \"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a\") " Apr 20 14:37:34.862040 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.862017 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" (UID: "6601a57c-c8e3-47bd-8a7c-e8c6deceae2a"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:37:34.863831 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.863814 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-kube-api-access-z8k5w" (OuterVolumeSpecName: "kube-api-access-z8k5w") pod "6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" (UID: "6601a57c-c8e3-47bd-8a7c-e8c6deceae2a"). InnerVolumeSpecName "kube-api-access-z8k5w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:37:34.963115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.963090 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-extensions-socket-volume\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:37:34.963115 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:34.963115 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8k5w\" (UniqueName: \"kubernetes.io/projected/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a-kube-api-access-z8k5w\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:37:35.535198 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.535168 2581 generic.go:358] "Generic (PLEG): container finished" podID="6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" containerID="4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f" exitCode=0 Apr 20 14:37:35.535581 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.535229 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" Apr 20 14:37:35.535581 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.535230 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" event={"ID":"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a","Type":"ContainerDied","Data":"4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f"} Apr 20 14:37:35.535581 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.535328 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92" event={"ID":"6601a57c-c8e3-47bd-8a7c-e8c6deceae2a","Type":"ContainerDied","Data":"0c276ebcc4d8d66833f22e74247e161f3d90bc1fd64ae71bd6c9205f43b4ebaa"} Apr 20 14:37:35.535581 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.535342 2581 scope.go:117] "RemoveContainer" containerID="4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f" Apr 20 14:37:35.544807 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.544788 2581 scope.go:117] "RemoveContainer" containerID="4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f" Apr 20 14:37:35.545075 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:37:35.545055 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f\": container with ID starting with 4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f not found: ID does not exist" containerID="4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f" Apr 20 14:37:35.545129 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.545083 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f"} err="failed to get container status 
\"4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f\": rpc error: code = NotFound desc = could not find container \"4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f\": container with ID starting with 4eb1dd8db55d78ac8b86ceab275d3942f256919a20569b942d67987e741d086f not found: ID does not exist" Apr 20 14:37:35.560215 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.560186 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"] Apr 20 14:37:35.563778 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:35.563755 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-phn92"] Apr 20 14:37:36.387982 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:36.387950 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" path="/var/lib/kubelet/pods/6601a57c-c8e3-47bd-8a7c-e8c6deceae2a/volumes" Apr 20 14:37:51.614778 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.614745 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:37:51.615190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.615059 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab279091-c87c-4718-8f39-b1de243fb031" containerName="manager" Apr 20 14:37:51.615190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.615070 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab279091-c87c-4718-8f39-b1de243fb031" containerName="manager" Apr 20 14:37:51.615190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.615095 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" containerName="manager" Apr 20 14:37:51.615190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.615102 2581 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" containerName="manager" Apr 20 14:37:51.615190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.615150 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab279091-c87c-4718-8f39-b1de243fb031" containerName="manager" Apr 20 14:37:51.615190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.615162 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="6601a57c-c8e3-47bd-8a7c-e8c6deceae2a" containerName="manager" Apr 20 14:37:51.618175 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.618159 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.620451 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.620424 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 14:37:51.620590 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.620572 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jksts\"" Apr 20 14:37:51.624688 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.624654 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:37:51.691592 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.691558 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-config-file\") pod \"limitador-limitador-7d549b5b-fsfns\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.691775 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.691637 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvwd\" 
(UniqueName: \"kubernetes.io/projected/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-kube-api-access-hpvwd\") pod \"limitador-limitador-7d549b5b-fsfns\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.711618 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.711586 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:37:51.792900 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.792873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-config-file\") pod \"limitador-limitador-7d549b5b-fsfns\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.793064 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.792918 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvwd\" (UniqueName: \"kubernetes.io/projected/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-kube-api-access-hpvwd\") pod \"limitador-limitador-7d549b5b-fsfns\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.793456 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.793437 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-config-file\") pod \"limitador-limitador-7d549b5b-fsfns\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.801292 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.801264 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvwd\" (UniqueName: \"kubernetes.io/projected/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-kube-api-access-hpvwd\") 
pod \"limitador-limitador-7d549b5b-fsfns\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:51.929470 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:51.929399 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:52.064977 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:52.064950 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:37:52.067064 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:37:52.067036 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b7a1543_7fef_4cf5_ad43_7018ad5c7452.slice/crio-378ae5e1f5c9e03a7e22a73c2e9b2cb2073a576928f752dd696388ef2bf037ce WatchSource:0}: Error finding container 378ae5e1f5c9e03a7e22a73c2e9b2cb2073a576928f752dd696388ef2bf037ce: Status 404 returned error can't find the container with id 378ae5e1f5c9e03a7e22a73c2e9b2cb2073a576928f752dd696388ef2bf037ce Apr 20 14:37:52.603336 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:52.603303 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" event={"ID":"2b7a1543-7fef-4cf5-ad43-7018ad5c7452","Type":"ContainerStarted","Data":"378ae5e1f5c9e03a7e22a73c2e9b2cb2073a576928f752dd696388ef2bf037ce"} Apr 20 14:37:55.616831 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:55.616788 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" event={"ID":"2b7a1543-7fef-4cf5-ad43-7018ad5c7452","Type":"ContainerStarted","Data":"10b2e5f08a0e86216d3136e1eb5613f54c0ff057af56870987f15c7fa88bbacc"} Apr 20 14:37:55.617266 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:55.616952 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:37:55.632644 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:37:55.632594 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" podStartSLOduration=1.9875602510000001 podStartE2EDuration="4.632580724s" podCreationTimestamp="2026-04-20 14:37:51 +0000 UTC" firstStartedPulling="2026-04-20 14:37:52.068806204 +0000 UTC m=+652.248294650" lastFinishedPulling="2026-04-20 14:37:54.713826662 +0000 UTC m=+654.893315123" observedRunningTime="2026-04-20 14:37:55.631495022 +0000 UTC m=+655.810983490" watchObservedRunningTime="2026-04-20 14:37:55.632580724 +0000 UTC m=+655.812069237" Apr 20 14:38:06.232624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.232591 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:38:06.233135 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.232842 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" podUID="2b7a1543-7fef-4cf5-ad43-7018ad5c7452" containerName="limitador" containerID="cri-o://10b2e5f08a0e86216d3136e1eb5613f54c0ff057af56870987f15c7fa88bbacc" gracePeriod=30 Apr 20 14:38:06.233437 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.233419 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:38:06.662137 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.662102 2581 generic.go:358] "Generic (PLEG): container finished" podID="2b7a1543-7fef-4cf5-ad43-7018ad5c7452" containerID="10b2e5f08a0e86216d3136e1eb5613f54c0ff057af56870987f15c7fa88bbacc" exitCode=0 Apr 20 14:38:06.662236 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.662169 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" 
event={"ID":"2b7a1543-7fef-4cf5-ad43-7018ad5c7452","Type":"ContainerDied","Data":"10b2e5f08a0e86216d3136e1eb5613f54c0ff057af56870987f15c7fa88bbacc"} Apr 20 14:38:06.780944 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.780925 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:38:06.913277 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.913212 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-config-file\") pod \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " Apr 20 14:38:06.913399 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.913306 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvwd\" (UniqueName: \"kubernetes.io/projected/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-kube-api-access-hpvwd\") pod \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\" (UID: \"2b7a1543-7fef-4cf5-ad43-7018ad5c7452\") " Apr 20 14:38:06.913549 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.913525 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-config-file" (OuterVolumeSpecName: "config-file") pod "2b7a1543-7fef-4cf5-ad43-7018ad5c7452" (UID: "2b7a1543-7fef-4cf5-ad43-7018ad5c7452"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:38:06.915508 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:06.915485 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-kube-api-access-hpvwd" (OuterVolumeSpecName: "kube-api-access-hpvwd") pod "2b7a1543-7fef-4cf5-ad43-7018ad5c7452" (UID: "2b7a1543-7fef-4cf5-ad43-7018ad5c7452"). 
InnerVolumeSpecName "kube-api-access-hpvwd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:38:07.014550 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.014525 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpvwd\" (UniqueName: \"kubernetes.io/projected/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-kube-api-access-hpvwd\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:38:07.014550 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.014547 2581 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2b7a1543-7fef-4cf5-ad43-7018ad5c7452-config-file\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:38:07.667404 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.667372 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" Apr 20 14:38:07.667847 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.667397 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fsfns" event={"ID":"2b7a1543-7fef-4cf5-ad43-7018ad5c7452","Type":"ContainerDied","Data":"378ae5e1f5c9e03a7e22a73c2e9b2cb2073a576928f752dd696388ef2bf037ce"} Apr 20 14:38:07.667847 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.667446 2581 scope.go:117] "RemoveContainer" containerID="10b2e5f08a0e86216d3136e1eb5613f54c0ff057af56870987f15c7fa88bbacc" Apr 20 14:38:07.689022 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.688997 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:38:07.690979 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:07.690958 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fsfns"] Apr 20 14:38:08.387737 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:08.387699 2581 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="2b7a1543-7fef-4cf5-ad43-7018ad5c7452" path="/var/lib/kubelet/pods/2b7a1543-7fef-4cf5-ad43-7018ad5c7452/volumes" Apr 20 14:38:09.049995 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.049962 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-5kr8w"] Apr 20 14:38:09.050554 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.050518 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b7a1543-7fef-4cf5-ad43-7018ad5c7452" containerName="limitador" Apr 20 14:38:09.050554 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.050545 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7a1543-7fef-4cf5-ad43-7018ad5c7452" containerName="limitador" Apr 20 14:38:09.050842 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.050681 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b7a1543-7fef-4cf5-ad43-7018ad5c7452" containerName="limitador" Apr 20 14:38:09.054103 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.054081 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.056401 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.056381 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 14:38:09.056515 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.056455 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-gbkdk\"" Apr 20 14:38:09.061182 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.060970 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-5kr8w"] Apr 20 14:38:09.130488 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.130460 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vzd\" (UniqueName: \"kubernetes.io/projected/f42738c5-6c35-4d09-aa1d-ae5ef4aae006-kube-api-access-72vzd\") pod \"postgres-868db5846d-5kr8w\" (UID: \"f42738c5-6c35-4d09-aa1d-ae5ef4aae006\") " pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.130590 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.130508 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f42738c5-6c35-4d09-aa1d-ae5ef4aae006-data\") pod \"postgres-868db5846d-5kr8w\" (UID: \"f42738c5-6c35-4d09-aa1d-ae5ef4aae006\") " pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.231099 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.231072 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72vzd\" (UniqueName: \"kubernetes.io/projected/f42738c5-6c35-4d09-aa1d-ae5ef4aae006-kube-api-access-72vzd\") pod \"postgres-868db5846d-5kr8w\" (UID: \"f42738c5-6c35-4d09-aa1d-ae5ef4aae006\") " pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.231213 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.231118 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f42738c5-6c35-4d09-aa1d-ae5ef4aae006-data\") pod \"postgres-868db5846d-5kr8w\" (UID: \"f42738c5-6c35-4d09-aa1d-ae5ef4aae006\") " pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.231458 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.231442 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f42738c5-6c35-4d09-aa1d-ae5ef4aae006-data\") pod \"postgres-868db5846d-5kr8w\" (UID: \"f42738c5-6c35-4d09-aa1d-ae5ef4aae006\") " pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.238987 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.238962 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vzd\" (UniqueName: \"kubernetes.io/projected/f42738c5-6c35-4d09-aa1d-ae5ef4aae006-kube-api-access-72vzd\") pod \"postgres-868db5846d-5kr8w\" (UID: \"f42738c5-6c35-4d09-aa1d-ae5ef4aae006\") " pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.367252 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.367197 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:09.488116 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.488088 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-5kr8w"] Apr 20 14:38:09.489670 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:38:09.489624 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42738c5_6c35_4d09_aa1d_ae5ef4aae006.slice/crio-b5e3479c1256ff0d30e5841fec51aae943d0acfdc58f636cbcc7fc0bdf458595 WatchSource:0}: Error finding container b5e3479c1256ff0d30e5841fec51aae943d0acfdc58f636cbcc7fc0bdf458595: Status 404 returned error can't find the container with id b5e3479c1256ff0d30e5841fec51aae943d0acfdc58f636cbcc7fc0bdf458595 Apr 20 14:38:09.678004 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:09.677938 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-5kr8w" event={"ID":"f42738c5-6c35-4d09-aa1d-ae5ef4aae006","Type":"ContainerStarted","Data":"b5e3479c1256ff0d30e5841fec51aae943d0acfdc58f636cbcc7fc0bdf458595"} Apr 20 14:38:14.701706 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:14.701622 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-5kr8w" event={"ID":"f42738c5-6c35-4d09-aa1d-ae5ef4aae006","Type":"ContainerStarted","Data":"ae5f952991def9237161d5c00980a5651f722834b49bfcc6fe5f7fbbce517384"} Apr 20 14:38:14.702118 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:14.701748 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:14.720156 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:14.720112 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-5kr8w" podStartSLOduration=0.877658508 podStartE2EDuration="5.720100872s" podCreationTimestamp="2026-04-20 14:38:09 +0000 UTC" 
firstStartedPulling="2026-04-20 14:38:09.490875687 +0000 UTC m=+669.670364133" lastFinishedPulling="2026-04-20 14:38:14.33331805 +0000 UTC m=+674.512806497" observedRunningTime="2026-04-20 14:38:14.718442659 +0000 UTC m=+674.897931128" watchObservedRunningTime="2026-04-20 14:38:14.720100872 +0000 UTC m=+674.899589340" Apr 20 14:38:20.733860 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:20.733831 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-5kr8w" Apr 20 14:38:24.396744 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.396697 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gvcz"] Apr 20 14:38:24.572947 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.572916 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gvcz"] Apr 20 14:38:24.572947 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.572947 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7d784fcc97-pxk9x"] Apr 20 14:38:24.573132 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.573099 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:24.575429 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.575407 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-sc5bg\"" Apr 20 14:38:24.596417 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.596396 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7d784fcc97-pxk9x"] Apr 20 14:38:24.596530 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.596493 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:24.658553 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.658498 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6sv\" (UniqueName: \"kubernetes.io/projected/e32e27f6-a966-43c2-aa38-2af928d2aca0-kube-api-access-4f6sv\") pod \"maas-controller-6d4c8f55f9-2gvcz\" (UID: \"e32e27f6-a966-43c2-aa38-2af928d2aca0\") " pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:24.667580 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.667548 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gvcz"] Apr 20 14:38:24.667769 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:38:24.667751 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-4f6sv], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" podUID="e32e27f6-a966-43c2-aa38-2af928d2aca0" Apr 20 14:38:24.690128 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.690101 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-54cb8f5bbb-v2jbv"] Apr 20 14:38:24.741282 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.741254 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54cb8f5bbb-v2jbv"] Apr 20 14:38:24.741405 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.741345 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:24.741405 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.741367 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:24.745682 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.745665 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:24.758938 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.758900 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6sv\" (UniqueName: \"kubernetes.io/projected/e32e27f6-a966-43c2-aa38-2af928d2aca0-kube-api-access-4f6sv\") pod \"maas-controller-6d4c8f55f9-2gvcz\" (UID: \"e32e27f6-a966-43c2-aa38-2af928d2aca0\") " pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:24.759026 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.758963 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4hz2\" (UniqueName: \"kubernetes.io/projected/405f2c62-fdef-4490-af87-b029c9f4b6cd-kube-api-access-z4hz2\") pod \"maas-controller-7d784fcc97-pxk9x\" (UID: \"405f2c62-fdef-4490-af87-b029c9f4b6cd\") " pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:24.767163 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.767143 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6sv\" (UniqueName: \"kubernetes.io/projected/e32e27f6-a966-43c2-aa38-2af928d2aca0-kube-api-access-4f6sv\") pod \"maas-controller-6d4c8f55f9-2gvcz\" (UID: \"e32e27f6-a966-43c2-aa38-2af928d2aca0\") " pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:24.859309 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.859289 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f6sv\" (UniqueName: \"kubernetes.io/projected/e32e27f6-a966-43c2-aa38-2af928d2aca0-kube-api-access-4f6sv\") pod \"e32e27f6-a966-43c2-aa38-2af928d2aca0\" (UID: \"e32e27f6-a966-43c2-aa38-2af928d2aca0\") " Apr 20 14:38:24.859409 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.859395 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4hz2\" (UniqueName: 
\"kubernetes.io/projected/405f2c62-fdef-4490-af87-b029c9f4b6cd-kube-api-access-z4hz2\") pod \"maas-controller-7d784fcc97-pxk9x\" (UID: \"405f2c62-fdef-4490-af87-b029c9f4b6cd\") " pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:24.859449 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.859422 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjtxx\" (UniqueName: \"kubernetes.io/projected/5229efcb-1902-4240-a871-37ce504bdc16-kube-api-access-mjtxx\") pod \"maas-controller-54cb8f5bbb-v2jbv\" (UID: \"5229efcb-1902-4240-a871-37ce504bdc16\") " pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:24.861283 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.861262 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32e27f6-a966-43c2-aa38-2af928d2aca0-kube-api-access-4f6sv" (OuterVolumeSpecName: "kube-api-access-4f6sv") pod "e32e27f6-a966-43c2-aa38-2af928d2aca0" (UID: "e32e27f6-a966-43c2-aa38-2af928d2aca0"). InnerVolumeSpecName "kube-api-access-4f6sv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:38:24.869600 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.869580 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4hz2\" (UniqueName: \"kubernetes.io/projected/405f2c62-fdef-4490-af87-b029c9f4b6cd-kube-api-access-z4hz2\") pod \"maas-controller-7d784fcc97-pxk9x\" (UID: \"405f2c62-fdef-4490-af87-b029c9f4b6cd\") " pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:24.905439 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.905419 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:24.960557 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.960522 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjtxx\" (UniqueName: \"kubernetes.io/projected/5229efcb-1902-4240-a871-37ce504bdc16-kube-api-access-mjtxx\") pod \"maas-controller-54cb8f5bbb-v2jbv\" (UID: \"5229efcb-1902-4240-a871-37ce504bdc16\") " pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:24.960683 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.960599 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4f6sv\" (UniqueName: \"kubernetes.io/projected/e32e27f6-a966-43c2-aa38-2af928d2aca0-kube-api-access-4f6sv\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:38:24.970134 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:24.970107 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjtxx\" (UniqueName: \"kubernetes.io/projected/5229efcb-1902-4240-a871-37ce504bdc16-kube-api-access-mjtxx\") pod \"maas-controller-54cb8f5bbb-v2jbv\" (UID: \"5229efcb-1902-4240-a871-37ce504bdc16\") " pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:25.024781 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.024755 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7d784fcc97-pxk9x"] Apr 20 14:38:25.027085 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:38:25.027058 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod405f2c62_fdef_4490_af87_b029c9f4b6cd.slice/crio-010a0d1803d0a05cba870e004d44275a6614448ead96e498209b0fc8e2dfde6a WatchSource:0}: Error finding container 010a0d1803d0a05cba870e004d44275a6614448ead96e498209b0fc8e2dfde6a: Status 404 returned error can't find the container with id 
010a0d1803d0a05cba870e004d44275a6614448ead96e498209b0fc8e2dfde6a Apr 20 14:38:25.054395 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.054370 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:25.173491 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.173462 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54cb8f5bbb-v2jbv"] Apr 20 14:38:25.174976 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:38:25.174945 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5229efcb_1902_4240_a871_37ce504bdc16.slice/crio-6b2db3480bc5b6c5a0059de47c479d9a05d3c528e5d60759d9775b68cd4b4bfe WatchSource:0}: Error finding container 6b2db3480bc5b6c5a0059de47c479d9a05d3c528e5d60759d9775b68cd4b4bfe: Status 404 returned error can't find the container with id 6b2db3480bc5b6c5a0059de47c479d9a05d3c528e5d60759d9775b68cd4b4bfe Apr 20 14:38:25.753582 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.753520 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" event={"ID":"5229efcb-1902-4240-a871-37ce504bdc16","Type":"ContainerStarted","Data":"6b2db3480bc5b6c5a0059de47c479d9a05d3c528e5d60759d9775b68cd4b4bfe"} Apr 20 14:38:25.758027 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.757202 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gvcz" Apr 20 14:38:25.758027 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.757346 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" event={"ID":"405f2c62-fdef-4490-af87-b029c9f4b6cd","Type":"ContainerStarted","Data":"010a0d1803d0a05cba870e004d44275a6614448ead96e498209b0fc8e2dfde6a"} Apr 20 14:38:25.809270 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.809219 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gvcz"] Apr 20 14:38:25.812737 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:25.812701 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gvcz"] Apr 20 14:38:26.390032 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:26.389798 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e32e27f6-a966-43c2-aa38-2af928d2aca0" path="/var/lib/kubelet/pods/e32e27f6-a966-43c2-aa38-2af928d2aca0/volumes" Apr 20 14:38:28.769512 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:28.769469 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" event={"ID":"5229efcb-1902-4240-a871-37ce504bdc16","Type":"ContainerStarted","Data":"6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04"} Apr 20 14:38:28.769987 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:28.769523 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:28.770893 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:28.770869 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" event={"ID":"405f2c62-fdef-4490-af87-b029c9f4b6cd","Type":"ContainerStarted","Data":"3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9"} Apr 20 14:38:28.771044 ip-10-0-139-136 
kubenswrapper[2581]: I0420 14:38:28.771026 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:28.789442 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:28.789388 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" podStartSLOduration=2.261430116 podStartE2EDuration="4.789375176s" podCreationTimestamp="2026-04-20 14:38:24 +0000 UTC" firstStartedPulling="2026-04-20 14:38:25.176359249 +0000 UTC m=+685.355847695" lastFinishedPulling="2026-04-20 14:38:27.704304307 +0000 UTC m=+687.883792755" observedRunningTime="2026-04-20 14:38:28.787243068 +0000 UTC m=+688.966731536" watchObservedRunningTime="2026-04-20 14:38:28.789375176 +0000 UTC m=+688.968863643" Apr 20 14:38:28.809797 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:28.809759 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" podStartSLOduration=2.143926173 podStartE2EDuration="4.809747872s" podCreationTimestamp="2026-04-20 14:38:24 +0000 UTC" firstStartedPulling="2026-04-20 14:38:25.028304394 +0000 UTC m=+685.207792839" lastFinishedPulling="2026-04-20 14:38:27.694126078 +0000 UTC m=+687.873614538" observedRunningTime="2026-04-20 14:38:28.808509374 +0000 UTC m=+688.987997844" watchObservedRunningTime="2026-04-20 14:38:28.809747872 +0000 UTC m=+688.989236339" Apr 20 14:38:29.973599 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:29.973565 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-84d488ddb8-6wh6w"] Apr 20 14:38:29.977055 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:29.977038 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:29.979490 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:29.979463 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 14:38:29.979674 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:29.979490 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 14:38:29.979674 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:29.979494 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5vm9x\"" Apr 20 14:38:29.985792 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:29.985767 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-84d488ddb8-6wh6w"] Apr 20 14:38:30.105883 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.105854 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-maas-api-tls\") pod \"maas-api-84d488ddb8-6wh6w\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.106051 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.105913 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tjtc\" (UniqueName: \"kubernetes.io/projected/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-kube-api-access-5tjtc\") pod \"maas-api-84d488ddb8-6wh6w\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.206612 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.206583 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-maas-api-tls\") pod 
\"maas-api-84d488ddb8-6wh6w\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.206807 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.206643 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tjtc\" (UniqueName: \"kubernetes.io/projected/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-kube-api-access-5tjtc\") pod \"maas-api-84d488ddb8-6wh6w\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.209519 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.209491 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-maas-api-tls\") pod \"maas-api-84d488ddb8-6wh6w\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.214372 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.214346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tjtc\" (UniqueName: \"kubernetes.io/projected/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-kube-api-access-5tjtc\") pod \"maas-api-84d488ddb8-6wh6w\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.290370 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.290288 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:30.430983 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.430953 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-84d488ddb8-6wh6w"] Apr 20 14:38:30.433911 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:38:30.433872 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19bbbaf_154d_4aa9_825c_b4cb25ab9975.slice/crio-d8d8c078346e67f33fa0720b76b42e7960837d722173b37e87bcca56cf05426d WatchSource:0}: Error finding container d8d8c078346e67f33fa0720b76b42e7960837d722173b37e87bcca56cf05426d: Status 404 returned error can't find the container with id d8d8c078346e67f33fa0720b76b42e7960837d722173b37e87bcca56cf05426d Apr 20 14:38:30.779895 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:30.779852 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84d488ddb8-6wh6w" event={"ID":"a19bbbaf-154d-4aa9-825c-b4cb25ab9975","Type":"ContainerStarted","Data":"d8d8c078346e67f33fa0720b76b42e7960837d722173b37e87bcca56cf05426d"} Apr 20 14:38:32.791671 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:32.791583 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84d488ddb8-6wh6w" event={"ID":"a19bbbaf-154d-4aa9-825c-b4cb25ab9975","Type":"ContainerStarted","Data":"d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436"} Apr 20 14:38:32.792094 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:32.791671 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:32.807975 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:32.807932 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-84d488ddb8-6wh6w" podStartSLOduration=1.735135882 podStartE2EDuration="3.807918957s" podCreationTimestamp="2026-04-20 14:38:29 +0000 UTC" 
firstStartedPulling="2026-04-20 14:38:30.435426459 +0000 UTC m=+690.614914906" lastFinishedPulling="2026-04-20 14:38:32.508209534 +0000 UTC m=+692.687697981" observedRunningTime="2026-04-20 14:38:32.806856401 +0000 UTC m=+692.986344870" watchObservedRunningTime="2026-04-20 14:38:32.807918957 +0000 UTC m=+692.987407457" Apr 20 14:38:38.801039 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:38.800964 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:38:39.780257 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:39.780229 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:39.780432 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:39.780279 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:39.835439 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:39.835410 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7d784fcc97-pxk9x"] Apr 20 14:38:39.835867 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:39.835618 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" podUID="405f2c62-fdef-4490-af87-b029c9f4b6cd" containerName="manager" containerID="cri-o://3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9" gracePeriod=10 Apr 20 14:38:40.075260 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.075235 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:40.082526 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.082508 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4hz2\" (UniqueName: \"kubernetes.io/projected/405f2c62-fdef-4490-af87-b029c9f4b6cd-kube-api-access-z4hz2\") pod \"405f2c62-fdef-4490-af87-b029c9f4b6cd\" (UID: \"405f2c62-fdef-4490-af87-b029c9f4b6cd\") " Apr 20 14:38:40.084649 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.084628 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405f2c62-fdef-4490-af87-b029c9f4b6cd-kube-api-access-z4hz2" (OuterVolumeSpecName: "kube-api-access-z4hz2") pod "405f2c62-fdef-4490-af87-b029c9f4b6cd" (UID: "405f2c62-fdef-4490-af87-b029c9f4b6cd"). InnerVolumeSpecName "kube-api-access-z4hz2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:38:40.133161 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.133119 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-79cb8b9576-tlf2w"] Apr 20 14:38:40.133529 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.133514 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="405f2c62-fdef-4490-af87-b029c9f4b6cd" containerName="manager" Apr 20 14:38:40.133574 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.133531 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="405f2c62-fdef-4490-af87-b029c9f4b6cd" containerName="manager" Apr 20 14:38:40.133606 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.133592 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="405f2c62-fdef-4490-af87-b029c9f4b6cd" containerName="manager" Apr 20 14:38:40.136756 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.136717 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:40.142494 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.142466 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-tlf2w"] Apr 20 14:38:40.183156 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.183126 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm8hj\" (UniqueName: \"kubernetes.io/projected/3c816bda-b0d3-483b-a123-96d13cf52d34-kube-api-access-fm8hj\") pod \"maas-controller-79cb8b9576-tlf2w\" (UID: \"3c816bda-b0d3-483b-a123-96d13cf52d34\") " pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:40.183272 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.183213 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4hz2\" (UniqueName: \"kubernetes.io/projected/405f2c62-fdef-4490-af87-b029c9f4b6cd-kube-api-access-z4hz2\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:38:40.283935 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.283905 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fm8hj\" (UniqueName: \"kubernetes.io/projected/3c816bda-b0d3-483b-a123-96d13cf52d34-kube-api-access-fm8hj\") pod \"maas-controller-79cb8b9576-tlf2w\" (UID: \"3c816bda-b0d3-483b-a123-96d13cf52d34\") " pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:40.292380 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.292357 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm8hj\" (UniqueName: \"kubernetes.io/projected/3c816bda-b0d3-483b-a123-96d13cf52d34-kube-api-access-fm8hj\") pod \"maas-controller-79cb8b9576-tlf2w\" (UID: \"3c816bda-b0d3-483b-a123-96d13cf52d34\") " pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:40.449383 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.449352 2581 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:40.578478 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.578452 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-tlf2w"] Apr 20 14:38:40.580137 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:38:40.580105 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c816bda_b0d3_483b_a123_96d13cf52d34.slice/crio-1a2661eaf8a0ef664b5d97d9a6781ef657d7805c99837573f73a3c2a79be9445 WatchSource:0}: Error finding container 1a2661eaf8a0ef664b5d97d9a6781ef657d7805c99837573f73a3c2a79be9445: Status 404 returned error can't find the container with id 1a2661eaf8a0ef664b5d97d9a6781ef657d7805c99837573f73a3c2a79be9445 Apr 20 14:38:40.822535 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.822497 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" event={"ID":"3c816bda-b0d3-483b-a123-96d13cf52d34","Type":"ContainerStarted","Data":"1a2661eaf8a0ef664b5d97d9a6781ef657d7805c99837573f73a3c2a79be9445"} Apr 20 14:38:40.823653 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.823625 2581 generic.go:358] "Generic (PLEG): container finished" podID="405f2c62-fdef-4490-af87-b029c9f4b6cd" containerID="3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9" exitCode=0 Apr 20 14:38:40.823829 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.823685 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" event={"ID":"405f2c62-fdef-4490-af87-b029c9f4b6cd","Type":"ContainerDied","Data":"3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9"} Apr 20 14:38:40.823829 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.823708 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" Apr 20 14:38:40.823829 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.823715 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7d784fcc97-pxk9x" event={"ID":"405f2c62-fdef-4490-af87-b029c9f4b6cd","Type":"ContainerDied","Data":"010a0d1803d0a05cba870e004d44275a6614448ead96e498209b0fc8e2dfde6a"} Apr 20 14:38:40.823829 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.823760 2581 scope.go:117] "RemoveContainer" containerID="3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9" Apr 20 14:38:40.832304 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.832281 2581 scope.go:117] "RemoveContainer" containerID="3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9" Apr 20 14:38:40.832558 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:38:40.832538 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9\": container with ID starting with 3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9 not found: ID does not exist" containerID="3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9" Apr 20 14:38:40.832618 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.832566 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9"} err="failed to get container status \"3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9\": rpc error: code = NotFound desc = could not find container \"3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9\": container with ID starting with 3c2db3a2b8b2a23673aae71efd4acbb2b4b7f133d1d78a73e8a47e14c70977a9 not found: ID does not exist" Apr 20 14:38:40.842424 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.842396 2581 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7d784fcc97-pxk9x"] Apr 20 14:38:40.847742 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:40.847702 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7d784fcc97-pxk9x"] Apr 20 14:38:41.829072 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:41.829034 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" event={"ID":"3c816bda-b0d3-483b-a123-96d13cf52d34","Type":"ContainerStarted","Data":"3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb"} Apr 20 14:38:41.829385 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:41.829159 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:41.847428 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:41.847379 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" podStartSLOduration=1.546660535 podStartE2EDuration="1.847366928s" podCreationTimestamp="2026-04-20 14:38:40 +0000 UTC" firstStartedPulling="2026-04-20 14:38:40.581363132 +0000 UTC m=+700.760851581" lastFinishedPulling="2026-04-20 14:38:40.882069513 +0000 UTC m=+701.061557974" observedRunningTime="2026-04-20 14:38:41.843876627 +0000 UTC m=+702.023365094" watchObservedRunningTime="2026-04-20 14:38:41.847366928 +0000 UTC m=+702.026855395" Apr 20 14:38:42.387560 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:42.387527 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405f2c62-fdef-4490-af87-b029c9f4b6cd" path="/var/lib/kubelet/pods/405f2c62-fdef-4490-af87-b029c9f4b6cd/volumes" Apr 20 14:38:52.838576 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:52.838541 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:38:52.878782 
ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:52.878748 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-54cb8f5bbb-v2jbv"] Apr 20 14:38:52.879084 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:52.879030 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" podUID="5229efcb-1902-4240-a871-37ce504bdc16" containerName="manager" containerID="cri-o://6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04" gracePeriod=10 Apr 20 14:38:53.123801 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.123781 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:53.188510 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.188486 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjtxx\" (UniqueName: \"kubernetes.io/projected/5229efcb-1902-4240-a871-37ce504bdc16-kube-api-access-mjtxx\") pod \"5229efcb-1902-4240-a871-37ce504bdc16\" (UID: \"5229efcb-1902-4240-a871-37ce504bdc16\") " Apr 20 14:38:53.190574 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.190544 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5229efcb-1902-4240-a871-37ce504bdc16-kube-api-access-mjtxx" (OuterVolumeSpecName: "kube-api-access-mjtxx") pod "5229efcb-1902-4240-a871-37ce504bdc16" (UID: "5229efcb-1902-4240-a871-37ce504bdc16"). InnerVolumeSpecName "kube-api-access-mjtxx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:38:53.289190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.289166 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjtxx\" (UniqueName: \"kubernetes.io/projected/5229efcb-1902-4240-a871-37ce504bdc16-kube-api-access-mjtxx\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:38:53.875413 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.875371 2581 generic.go:358] "Generic (PLEG): container finished" podID="5229efcb-1902-4240-a871-37ce504bdc16" containerID="6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04" exitCode=0 Apr 20 14:38:53.875884 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.875430 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" Apr 20 14:38:53.875884 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.875449 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" event={"ID":"5229efcb-1902-4240-a871-37ce504bdc16","Type":"ContainerDied","Data":"6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04"} Apr 20 14:38:53.875884 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.875488 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54cb8f5bbb-v2jbv" event={"ID":"5229efcb-1902-4240-a871-37ce504bdc16","Type":"ContainerDied","Data":"6b2db3480bc5b6c5a0059de47c479d9a05d3c528e5d60759d9775b68cd4b4bfe"} Apr 20 14:38:53.875884 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.875503 2581 scope.go:117] "RemoveContainer" containerID="6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04" Apr 20 14:38:53.885110 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.884913 2581 scope.go:117] "RemoveContainer" containerID="6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04" Apr 20 14:38:53.885184 ip-10-0-139-136 
kubenswrapper[2581]: E0420 14:38:53.885166 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04\": container with ID starting with 6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04 not found: ID does not exist" containerID="6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04" Apr 20 14:38:53.885225 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.885195 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04"} err="failed to get container status \"6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04\": rpc error: code = NotFound desc = could not find container \"6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04\": container with ID starting with 6f870133bfaf98aa476a21f88d47b3dc4481b92172ed6b6092ad39398fc29d04 not found: ID does not exist" Apr 20 14:38:53.901084 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.901056 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-54cb8f5bbb-v2jbv"] Apr 20 14:38:53.903252 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:53.903232 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-54cb8f5bbb-v2jbv"] Apr 20 14:38:54.387023 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:38:54.386997 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5229efcb-1902-4240-a871-37ce504bdc16" path="/var/lib/kubelet/pods/5229efcb-1902-4240-a871-37ce504bdc16/volumes" Apr 20 14:39:11.630594 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.630556 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-79fdf5ff45-cw6jv"] Apr 20 14:39:11.631034 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.630942 2581 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="5229efcb-1902-4240-a871-37ce504bdc16" containerName="manager" Apr 20 14:39:11.631034 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.630955 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5229efcb-1902-4240-a871-37ce504bdc16" containerName="manager" Apr 20 14:39:11.631034 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.631032 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5229efcb-1902-4240-a871-37ce504bdc16" containerName="manager" Apr 20 14:39:11.634183 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.634164 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.643009 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.642989 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79fdf5ff45-cw6jv"] Apr 20 14:39:11.730163 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.730129 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76083fc3-97c2-4b5a-be5c-a471f542b4f8-maas-api-tls\") pod \"maas-api-79fdf5ff45-cw6jv\" (UID: \"76083fc3-97c2-4b5a-be5c-a471f542b4f8\") " pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.730346 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.730178 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmkg\" (UniqueName: \"kubernetes.io/projected/76083fc3-97c2-4b5a-be5c-a471f542b4f8-kube-api-access-vfmkg\") pod \"maas-api-79fdf5ff45-cw6jv\" (UID: \"76083fc3-97c2-4b5a-be5c-a471f542b4f8\") " pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.830677 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.830637 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: 
\"kubernetes.io/secret/76083fc3-97c2-4b5a-be5c-a471f542b4f8-maas-api-tls\") pod \"maas-api-79fdf5ff45-cw6jv\" (UID: \"76083fc3-97c2-4b5a-be5c-a471f542b4f8\") " pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.830881 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.830692 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmkg\" (UniqueName: \"kubernetes.io/projected/76083fc3-97c2-4b5a-be5c-a471f542b4f8-kube-api-access-vfmkg\") pod \"maas-api-79fdf5ff45-cw6jv\" (UID: \"76083fc3-97c2-4b5a-be5c-a471f542b4f8\") " pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.833190 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.833166 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/76083fc3-97c2-4b5a-be5c-a471f542b4f8-maas-api-tls\") pod \"maas-api-79fdf5ff45-cw6jv\" (UID: \"76083fc3-97c2-4b5a-be5c-a471f542b4f8\") " pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.839208 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.839185 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmkg\" (UniqueName: \"kubernetes.io/projected/76083fc3-97c2-4b5a-be5c-a471f542b4f8-kube-api-access-vfmkg\") pod \"maas-api-79fdf5ff45-cw6jv\" (UID: \"76083fc3-97c2-4b5a-be5c-a471f542b4f8\") " pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:11.945836 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:11.945746 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:12.074826 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:12.074718 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-79fdf5ff45-cw6jv"] Apr 20 14:39:12.080406 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:39:12.080377 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76083fc3_97c2_4b5a_be5c_a471f542b4f8.slice/crio-60b65a65303d554caa9bac88f588d78d458a9e4180d2496c08f31c735a5f095c WatchSource:0}: Error finding container 60b65a65303d554caa9bac88f588d78d458a9e4180d2496c08f31c735a5f095c: Status 404 returned error can't find the container with id 60b65a65303d554caa9bac88f588d78d458a9e4180d2496c08f31c735a5f095c Apr 20 14:39:12.947507 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:12.947458 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" event={"ID":"76083fc3-97c2-4b5a-be5c-a471f542b4f8","Type":"ContainerStarted","Data":"60b65a65303d554caa9bac88f588d78d458a9e4180d2496c08f31c735a5f095c"} Apr 20 14:39:13.953000 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:13.952971 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" event={"ID":"76083fc3-97c2-4b5a-be5c-a471f542b4f8","Type":"ContainerStarted","Data":"4c43875378a85504a71d12381bffe72106b181423518ae2f0aa11314a0a55950"} Apr 20 14:39:13.953331 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:13.953082 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:13.970870 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:13.970786 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" podStartSLOduration=1.268253905 podStartE2EDuration="2.970770526s" podCreationTimestamp="2026-04-20 14:39:11 +0000 UTC" 
firstStartedPulling="2026-04-20 14:39:12.082024917 +0000 UTC m=+732.261513362" lastFinishedPulling="2026-04-20 14:39:13.784541524 +0000 UTC m=+733.964029983" observedRunningTime="2026-04-20 14:39:13.968290318 +0000 UTC m=+734.147778787" watchObservedRunningTime="2026-04-20 14:39:13.970770526 +0000 UTC m=+734.150258995" Apr 20 14:39:19.961491 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:19.961461 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-79fdf5ff45-cw6jv" Apr 20 14:39:20.004545 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.004510 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-84d488ddb8-6wh6w"] Apr 20 14:39:20.004803 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.004760 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-84d488ddb8-6wh6w" podUID="a19bbbaf-154d-4aa9-825c-b4cb25ab9975" containerName="maas-api" containerID="cri-o://d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436" gracePeriod=30 Apr 20 14:39:20.256046 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.256017 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-84d488ddb8-6wh6w" Apr 20 14:39:20.397930 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.397899 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-maas-api-tls\") pod \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " Apr 20 14:39:20.398092 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.397961 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tjtc\" (UniqueName: \"kubernetes.io/projected/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-kube-api-access-5tjtc\") pod \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\" (UID: \"a19bbbaf-154d-4aa9-825c-b4cb25ab9975\") " Apr 20 14:39:20.400062 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.400032 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "a19bbbaf-154d-4aa9-825c-b4cb25ab9975" (UID: "a19bbbaf-154d-4aa9-825c-b4cb25ab9975"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:39:20.400158 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.400079 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-kube-api-access-5tjtc" (OuterVolumeSpecName: "kube-api-access-5tjtc") pod "a19bbbaf-154d-4aa9-825c-b4cb25ab9975" (UID: "a19bbbaf-154d-4aa9-825c-b4cb25ab9975"). InnerVolumeSpecName "kube-api-access-5tjtc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:39:20.421257 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.421229 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv"] Apr 20 14:39:20.421615 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.421600 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a19bbbaf-154d-4aa9-825c-b4cb25ab9975" containerName="maas-api" Apr 20 14:39:20.421671 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.421618 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19bbbaf-154d-4aa9-825c-b4cb25ab9975" containerName="maas-api" Apr 20 14:39:20.421711 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.421690 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="a19bbbaf-154d-4aa9-825c-b4cb25ab9975" containerName="maas-api" Apr 20 14:39:20.424961 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.424947 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.427670 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.427653 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 14:39:20.427800 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.427681 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 14:39:20.427800 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.427701 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 14:39:20.427800 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.427653 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-5p5fx\"" Apr 20 14:39:20.436080 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.436059 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv"] Apr 20 14:39:20.499502 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499433 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.499502 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499477 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 
14:39:20.499660 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499605 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.499701 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499666 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0784e86-5872-40bd-80ec-27610ad81940-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.499767 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499717 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.499816 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499778 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnbbx\" (UniqueName: \"kubernetes.io/projected/a0784e86-5872-40bd-80ec-27610ad81940-kube-api-access-nnbbx\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.499895 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499880 2581 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" 
(UniqueName: \"kubernetes.io/secret/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-maas-api-tls\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:39:20.499938 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.499901 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tjtc\" (UniqueName: \"kubernetes.io/projected/a19bbbaf-154d-4aa9-825c-b4cb25ab9975-kube-api-access-5tjtc\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:39:20.601007 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.600967 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601182 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601039 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601182 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601069 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0784e86-5872-40bd-80ec-27610ad81940-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601182 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601112 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601182 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601140 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnbbx\" (UniqueName: \"kubernetes.io/projected/a0784e86-5872-40bd-80ec-27610ad81940-kube-api-access-nnbbx\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601424 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601221 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601745 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601552 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.601745 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601582 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 
14:39:20.601745 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.601668 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.604277 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.604251 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a0784e86-5872-40bd-80ec-27610ad81940-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.604473 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.604455 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a0784e86-5872-40bd-80ec-27610ad81940-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.608887 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.608865 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnbbx\" (UniqueName: \"kubernetes.io/projected/a0784e86-5872-40bd-80ec-27610ad81940-kube-api-access-nnbbx\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-n6swv\" (UID: \"a0784e86-5872-40bd-80ec-27610ad81940\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.735181 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.735151 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" Apr 20 14:39:20.862848 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.862826 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv"] Apr 20 14:39:20.864301 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:39:20.864272 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0784e86_5872_40bd_80ec_27610ad81940.slice/crio-2c35675e3621ccd278528041d6650c5b8c5a0fafce527113c2bdc487eb220f7c WatchSource:0}: Error finding container 2c35675e3621ccd278528041d6650c5b8c5a0fafce527113c2bdc487eb220f7c: Status 404 returned error can't find the container with id 2c35675e3621ccd278528041d6650c5b8c5a0fafce527113c2bdc487eb220f7c Apr 20 14:39:20.984045 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.984006 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" event={"ID":"a0784e86-5872-40bd-80ec-27610ad81940","Type":"ContainerStarted","Data":"2c35675e3621ccd278528041d6650c5b8c5a0fafce527113c2bdc487eb220f7c"} Apr 20 14:39:20.985228 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.985200 2581 generic.go:358] "Generic (PLEG): container finished" podID="a19bbbaf-154d-4aa9-825c-b4cb25ab9975" containerID="d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436" exitCode=0 Apr 20 14:39:20.985353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.985262 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84d488ddb8-6wh6w" event={"ID":"a19bbbaf-154d-4aa9-825c-b4cb25ab9975","Type":"ContainerDied","Data":"d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436"} Apr 20 14:39:20.985353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.985272 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-84d488ddb8-6wh6w"
Apr 20 14:39:20.985353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.985286 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-84d488ddb8-6wh6w" event={"ID":"a19bbbaf-154d-4aa9-825c-b4cb25ab9975","Type":"ContainerDied","Data":"d8d8c078346e67f33fa0720b76b42e7960837d722173b37e87bcca56cf05426d"}
Apr 20 14:39:20.985353 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.985301 2581 scope.go:117] "RemoveContainer" containerID="d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436"
Apr 20 14:39:20.994074 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.994054 2581 scope.go:117] "RemoveContainer" containerID="d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436"
Apr 20 14:39:20.994317 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:39:20.994298 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436\": container with ID starting with d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436 not found: ID does not exist" containerID="d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436"
Apr 20 14:39:20.994414 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:20.994325 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436"} err="failed to get container status \"d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436\": rpc error: code = NotFound desc = could not find container \"d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436\": container with ID starting with d381ed1c3d821aaf43ce6603589184763465cbb64ef650094dccb759764fa436 not found: ID does not exist"
Apr 20 14:39:21.009926 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:21.009893 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-84d488ddb8-6wh6w"]
Apr 20 14:39:21.012907 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:21.012887 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-84d488ddb8-6wh6w"]
Apr 20 14:39:22.388467 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:22.388431 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19bbbaf-154d-4aa9-825c-b4cb25ab9975" path="/var/lib/kubelet/pods/a19bbbaf-154d-4aa9-825c-b4cb25ab9975/volumes"
Apr 20 14:39:27.705221 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.705189 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"]
Apr 20 14:39:27.708846 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.708830 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.711283 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.711263 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 20 14:39:27.719960 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.719935 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"]
Apr 20 14:39:27.872255 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.872053 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.872255 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.872105 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.872255 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.872132 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.872255 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.872162 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbvs9\" (UniqueName: \"kubernetes.io/projected/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-kube-api-access-rbvs9\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.872255 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.872207 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.872656 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.872266 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.973438 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.973391 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.973627 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.973503 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.973627 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.973531 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.973627 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.973558 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.973627 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.973587 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbvs9\" (UniqueName: \"kubernetes.io/projected/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-kube-api-access-rbvs9\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.973627 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.973613 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.974652 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.974625 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.974996 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.974970 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.975208 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.975188 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.976899 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.976849 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.978219 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.978191 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:27.982773 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:27.982718 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbvs9\" (UniqueName: \"kubernetes.io/projected/2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3-kube-api-access-rbvs9\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks\" (UID: \"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:28.020709 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:28.020672 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:28.022505 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:28.022471 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" event={"ID":"a0784e86-5872-40bd-80ec-27610ad81940","Type":"ContainerStarted","Data":"d9c8255477fb65b3db64853718c931eb4b0fca1fcf5cd460cd2ffcf49e3d7242"}
Apr 20 14:39:28.365994 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:28.365966 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"]
Apr 20 14:39:29.029018 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:29.028956 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks" event={"ID":"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3","Type":"ContainerStarted","Data":"07a3f3978839e233c6e0941b2a41f074b21ea80c895422f2ba077372deff49a5"}
Apr 20 14:39:29.029018 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:29.028999 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks" event={"ID":"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3","Type":"ContainerStarted","Data":"1162f076cba9513a4c34a072971c5dbaef6475332fcdd258b5fd4fb22a8d5060"}
Apr 20 14:39:31.109888 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.109848 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"]
Apr 20 14:39:31.113705 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.113681 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.116241 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.116214 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 20 14:39:31.125589 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.125568 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"]
Apr 20 14:39:31.204485 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.204457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.204686 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.204661 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/794d9478-89a5-4434-a337-2ac15854e724-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.204797 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.204712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.204859 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.204811 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.204915 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.204864 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.204915 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.204892 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxm8\" (UniqueName: \"kubernetes.io/projected/794d9478-89a5-4434-a337-2ac15854e724-kube-api-access-tpxm8\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.305690 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.305788 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.305814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxm8\" (UniqueName: \"kubernetes.io/projected/794d9478-89a5-4434-a337-2ac15854e724-kube-api-access-tpxm8\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.305873 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.305907 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/794d9478-89a5-4434-a337-2ac15854e724-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306014 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.305937 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306372 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.306118 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306601 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.306578 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.306780 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.306637 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.308909 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.308888 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/794d9478-89a5-4434-a337-2ac15854e724-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.309031 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.308954 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/794d9478-89a5-4434-a337-2ac15854e724-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.314763 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.314712 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxm8\" (UniqueName: \"kubernetes.io/projected/794d9478-89a5-4434-a337-2ac15854e724-kube-api-access-tpxm8\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q\" (UID: \"794d9478-89a5-4434-a337-2ac15854e724\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.426483 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.425983 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:31.784297 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:31.784258 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"]
Apr 20 14:39:31.786247 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:39:31.786186 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794d9478_89a5_4434_a337_2ac15854e724.slice/crio-3fe3b2bbd17a0f6cd3f3fdaceab3aefff7818c8180ddc73cbc6bbf1a61022d54 WatchSource:0}: Error finding container 3fe3b2bbd17a0f6cd3f3fdaceab3aefff7818c8180ddc73cbc6bbf1a61022d54: Status 404 returned error can't find the container with id 3fe3b2bbd17a0f6cd3f3fdaceab3aefff7818c8180ddc73cbc6bbf1a61022d54
Apr 20 14:39:32.044163 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:32.043412 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q" event={"ID":"794d9478-89a5-4434-a337-2ac15854e724","Type":"ContainerStarted","Data":"42406c6b18951bbeb2dc3286f05a25e0d6f3ac265caacedc45f279500e4125e7"}
Apr 20 14:39:32.044163 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:32.043450 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q" event={"ID":"794d9478-89a5-4434-a337-2ac15854e724","Type":"ContainerStarted","Data":"3fe3b2bbd17a0f6cd3f3fdaceab3aefff7818c8180ddc73cbc6bbf1a61022d54"}
Apr 20 14:39:34.053578 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:34.053545 2581 generic.go:358] "Generic (PLEG): container finished" podID="a0784e86-5872-40bd-80ec-27610ad81940" containerID="d9c8255477fb65b3db64853718c931eb4b0fca1fcf5cd460cd2ffcf49e3d7242" exitCode=0
Apr 20 14:39:34.053980 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:34.053616 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" event={"ID":"a0784e86-5872-40bd-80ec-27610ad81940","Type":"ContainerDied","Data":"d9c8255477fb65b3db64853718c931eb4b0fca1fcf5cd460cd2ffcf49e3d7242"}
Apr 20 14:39:35.059450 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:35.059411 2581 generic.go:358] "Generic (PLEG): container finished" podID="2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3" containerID="07a3f3978839e233c6e0941b2a41f074b21ea80c895422f2ba077372deff49a5" exitCode=0
Apr 20 14:39:35.060068 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:35.059501 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks" event={"ID":"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3","Type":"ContainerDied","Data":"07a3f3978839e233c6e0941b2a41f074b21ea80c895422f2ba077372deff49a5"}
Apr 20 14:39:36.064704 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:36.064671 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" event={"ID":"a0784e86-5872-40bd-80ec-27610ad81940","Type":"ContainerStarted","Data":"9d4c5ff6fb4cd7e08e9c534ba4eb50ed2fc1bd4d30c844b3ee136dc545516fdc"}
Apr 20 14:39:36.065202 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:36.064929 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv"
Apr 20 14:39:36.066426 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:36.066401 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks" event={"ID":"2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3","Type":"ContainerStarted","Data":"6094ad16d21c5c3267626be9462880b12270c4a65de930e78aeb2532f684249d"}
Apr 20 14:39:36.066807 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:36.066778 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:36.083959 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:36.083912 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv" podStartSLOduration=1.425123086 podStartE2EDuration="16.083900166s" podCreationTimestamp="2026-04-20 14:39:20 +0000 UTC" firstStartedPulling="2026-04-20 14:39:20.866055509 +0000 UTC m=+741.045543956" lastFinishedPulling="2026-04-20 14:39:35.52483259 +0000 UTC m=+755.704321036" observedRunningTime="2026-04-20 14:39:36.081401717 +0000 UTC m=+756.260890197" watchObservedRunningTime="2026-04-20 14:39:36.083900166 +0000 UTC m=+756.263388631"
Apr 20 14:39:36.101402 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:36.101359 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks" podStartSLOduration=8.636100426 podStartE2EDuration="9.101344348s" podCreationTimestamp="2026-04-20 14:39:27 +0000 UTC" firstStartedPulling="2026-04-20 14:39:35.060073271 +0000 UTC m=+755.239561717" lastFinishedPulling="2026-04-20 14:39:35.52531718 +0000 UTC m=+755.704805639" observedRunningTime="2026-04-20 14:39:36.099596888 +0000 UTC m=+756.279085356" watchObservedRunningTime="2026-04-20 14:39:36.101344348 +0000 UTC m=+756.280832860"
Apr 20 14:39:38.078259 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:38.078225 2581 generic.go:358] "Generic (PLEG): container finished" podID="794d9478-89a5-4434-a337-2ac15854e724" containerID="42406c6b18951bbeb2dc3286f05a25e0d6f3ac265caacedc45f279500e4125e7" exitCode=0
Apr 20 14:39:38.078624 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:38.078309 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q" event={"ID":"794d9478-89a5-4434-a337-2ac15854e724","Type":"ContainerDied","Data":"42406c6b18951bbeb2dc3286f05a25e0d6f3ac265caacedc45f279500e4125e7"}
Apr 20 14:39:39.084596 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:39.084549 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q" event={"ID":"794d9478-89a5-4434-a337-2ac15854e724","Type":"ContainerStarted","Data":"628d94d154df539245bdd42b6f7eb9907869a9378fdeb1f3c3aaf34de5e8f3ec"}
Apr 20 14:39:39.084985 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:39.084855 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:39.102070 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:39.102031 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q" podStartSLOduration=7.826654352 podStartE2EDuration="8.102020649s" podCreationTimestamp="2026-04-20 14:39:31 +0000 UTC" firstStartedPulling="2026-04-20 14:39:38.079160933 +0000 UTC m=+758.258649380" lastFinishedPulling="2026-04-20 14:39:38.354527231 +0000 UTC m=+758.534015677" observedRunningTime="2026-04-20 14:39:39.100073676 +0000 UTC m=+759.279562145" watchObservedRunningTime="2026-04-20 14:39:39.102020649 +0000 UTC m=+759.281509125"
Apr 20 14:39:47.086984 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:47.086949 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks"
Apr 20 14:39:47.087940 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:47.087923 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-n6swv"
Apr 20 14:39:50.102604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:50.102561 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q"
Apr 20 14:39:57.123604 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.123556 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"]
Apr 20 14:39:57.155474 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.155447 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"]
Apr 20 14:39:57.155642 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.155546 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.158460 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.158436 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 20 14:39:57.232689 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.232655 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkgm\" (UniqueName: \"kubernetes.io/projected/550bc51d-37ae-45c6-985b-ef22843e6040-kube-api-access-hkkgm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.232863 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.232707 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.232863 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.232757 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.232863 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.232797 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.233019 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.232905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.233019 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.232950 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/550bc51d-37ae-45c6-985b-ef22843e6040-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334228 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334201 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkgm\" (UniqueName: \"kubernetes.io/projected/550bc51d-37ae-45c6-985b-ef22843e6040-kube-api-access-hkkgm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334247 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334274 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334352 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334302 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334502 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334502 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334391 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/550bc51d-37ae-45c6-985b-ef22843e6040-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334687 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334657 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334701 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.334827 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.334792 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.336640 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.336611 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/550bc51d-37ae-45c6-985b-ef22843e6040-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.336904 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.336888 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/550bc51d-37ae-45c6-985b-ef22843e6040-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.341687 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.341665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkgm\" (UniqueName: \"kubernetes.io/projected/550bc51d-37ae-45c6-985b-ef22843e6040-kube-api-access-hkkgm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-69bft\" (UID: \"550bc51d-37ae-45c6-985b-ef22843e6040\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.464894 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.464872 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"
Apr 20 14:39:57.606860 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:57.606832 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft"]
Apr 20 14:39:57.608680 ip-10-0-139-136 kubenswrapper[2581]: W0420 14:39:57.608640 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550bc51d_37ae_45c6_985b_ef22843e6040.slice/crio-01ed8ffed5f49aa324e299d0428d22b9be36804b991ea4e7d0b9fe6f9bdc5421 WatchSource:0}: Error finding container 01ed8ffed5f49aa324e299d0428d22b9be36804b991ea4e7d0b9fe6f9bdc5421: Status 404 returned error can't find the container with id 01ed8ffed5f49aa324e299d0428d22b9be36804b991ea4e7d0b9fe6f9bdc5421
Apr 20 14:39:58.165558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.165518 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" event={"ID":"550bc51d-37ae-45c6-985b-ef22843e6040","Type":"ContainerStarted","Data":"72ae7febaca034adb671f415c3837dd4f85df78345ba5e6b7d300b4fd04fdac7"}
Apr 20 14:39:58.165558 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.165559 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" event={"ID":"550bc51d-37ae-45c6-985b-ef22843e6040","Type":"ContainerStarted","Data":"01ed8ffed5f49aa324e299d0428d22b9be36804b991ea4e7d0b9fe6f9bdc5421"}
Apr 20 14:39:58.612142 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.612109 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c"]
Apr 20 14:39:58.615653 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.615632 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.618261 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.618239 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 14:39:58.626447 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.626424 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c"] Apr 20 14:39:58.745789 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.745758 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqgj\" (UniqueName: \"kubernetes.io/projected/a916b57d-5be1-4b40-a115-f81f6bc18d77-kube-api-access-kmqgj\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.745959 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.745809 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.745959 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.745866 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a916b57d-5be1-4b40-a115-f81f6bc18d77-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.745959 ip-10-0-139-136 kubenswrapper[2581]: I0420 
14:39:58.745929 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.746072 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.745991 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.746072 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.746035 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.846455 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846420 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.846634 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846467 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a916b57d-5be1-4b40-a115-f81f6bc18d77-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.846634 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846496 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.846634 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846532 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.846634 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846565 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.846889 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846713 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqgj\" (UniqueName: \"kubernetes.io/projected/a916b57d-5be1-4b40-a115-f81f6bc18d77-kube-api-access-kmqgj\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 
20 14:39:58.846946 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846891 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.847020 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.846999 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.847069 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.847015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.848992 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.848972 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a916b57d-5be1-4b40-a115-f81f6bc18d77-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.849256 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.849239 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a916b57d-5be1-4b40-a115-f81f6bc18d77-tls-certs\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.854387 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.854366 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqgj\" (UniqueName: \"kubernetes.io/projected/a916b57d-5be1-4b40-a115-f81f6bc18d77-kube-api-access-kmqgj\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c\" (UID: \"a916b57d-5be1-4b40-a115-f81f6bc18d77\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:58.928112 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:58.928047 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:39:59.105750 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:59.105159 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c"] Apr 20 14:39:59.172714 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:39:59.172684 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" event={"ID":"a916b57d-5be1-4b40-a115-f81f6bc18d77","Type":"ContainerStarted","Data":"098f843f65ba9e67f52a647a89e514e941b49937111f830865dd440be9152ec5"} Apr 20 14:40:00.179483 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:00.179439 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" event={"ID":"a916b57d-5be1-4b40-a115-f81f6bc18d77","Type":"ContainerStarted","Data":"d5ec8cb6cfa6ecc30c9db11aaa933b40dab1a552cd06279c825b100a0ca8fe1c"} Apr 20 14:40:05.203591 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:05.203476 2581 generic.go:358] "Generic (PLEG): container finished" podID="a916b57d-5be1-4b40-a115-f81f6bc18d77" containerID="d5ec8cb6cfa6ecc30c9db11aaa933b40dab1a552cd06279c825b100a0ca8fe1c" 
exitCode=0 Apr 20 14:40:05.203591 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:05.203548 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" event={"ID":"a916b57d-5be1-4b40-a115-f81f6bc18d77","Type":"ContainerDied","Data":"d5ec8cb6cfa6ecc30c9db11aaa933b40dab1a552cd06279c825b100a0ca8fe1c"} Apr 20 14:40:06.210662 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:06.209224 2581 generic.go:358] "Generic (PLEG): container finished" podID="550bc51d-37ae-45c6-985b-ef22843e6040" containerID="72ae7febaca034adb671f415c3837dd4f85df78345ba5e6b7d300b4fd04fdac7" exitCode=0 Apr 20 14:40:06.210662 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:06.209391 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" event={"ID":"550bc51d-37ae-45c6-985b-ef22843e6040","Type":"ContainerDied","Data":"72ae7febaca034adb671f415c3837dd4f85df78345ba5e6b7d300b4fd04fdac7"} Apr 20 14:40:06.214241 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:06.214214 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" event={"ID":"a916b57d-5be1-4b40-a115-f81f6bc18d77","Type":"ContainerStarted","Data":"307249b59fc82f3bebdb753213c56b0247f7eea8bfca5beb5f1aa5acff19a8ca"} Apr 20 14:40:06.214447 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:06.214427 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:40:06.246057 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:06.246005 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" podStartSLOduration=8.00507725 podStartE2EDuration="8.245992701s" podCreationTimestamp="2026-04-20 14:39:58 +0000 UTC" firstStartedPulling="2026-04-20 14:40:05.204274425 +0000 UTC m=+785.383762871" 
lastFinishedPulling="2026-04-20 14:40:05.44518986 +0000 UTC m=+785.624678322" observedRunningTime="2026-04-20 14:40:06.242292986 +0000 UTC m=+786.421781454" watchObservedRunningTime="2026-04-20 14:40:06.245992701 +0000 UTC m=+786.425481168" Apr 20 14:40:07.220024 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:07.219990 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" event={"ID":"550bc51d-37ae-45c6-985b-ef22843e6040","Type":"ContainerStarted","Data":"ebd6d640a6188ff7c04d6c7eefb24eda77e3688497653795cc525a3c73263702"} Apr 20 14:40:07.238083 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:07.238033 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" podStartSLOduration=9.975373852 podStartE2EDuration="10.23801902s" podCreationTimestamp="2026-04-20 14:39:57 +0000 UTC" firstStartedPulling="2026-04-20 14:40:06.210175176 +0000 UTC m=+786.389663640" lastFinishedPulling="2026-04-20 14:40:06.472820341 +0000 UTC m=+786.652308808" observedRunningTime="2026-04-20 14:40:07.236742608 +0000 UTC m=+787.416231069" watchObservedRunningTime="2026-04-20 14:40:07.23801902 +0000 UTC m=+787.417507488" Apr 20 14:40:17.220987 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:17.220951 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" Apr 20 14:40:17.239108 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:17.239083 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c" Apr 20 14:40:17.239497 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:40:17.239480 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-69bft" Apr 20 14:41:53.639435 ip-10-0-139-136 kubenswrapper[2581]: 
I0420 14:41:53.639361 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-tlf2w"] Apr 20 14:41:53.639907 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:53.639552 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" podUID="3c816bda-b0d3-483b-a123-96d13cf52d34" containerName="manager" containerID="cri-o://3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb" gracePeriod=10 Apr 20 14:41:53.884638 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:53.884617 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:41:53.971158 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:53.971132 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm8hj\" (UniqueName: \"kubernetes.io/projected/3c816bda-b0d3-483b-a123-96d13cf52d34-kube-api-access-fm8hj\") pod \"3c816bda-b0d3-483b-a123-96d13cf52d34\" (UID: \"3c816bda-b0d3-483b-a123-96d13cf52d34\") " Apr 20 14:41:53.973314 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:53.973288 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c816bda-b0d3-483b-a123-96d13cf52d34-kube-api-access-fm8hj" (OuterVolumeSpecName: "kube-api-access-fm8hj") pod "3c816bda-b0d3-483b-a123-96d13cf52d34" (UID: "3c816bda-b0d3-483b-a123-96d13cf52d34"). InnerVolumeSpecName "kube-api-access-fm8hj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:41:54.071953 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.071927 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fm8hj\" (UniqueName: \"kubernetes.io/projected/3c816bda-b0d3-483b-a123-96d13cf52d34-kube-api-access-fm8hj\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:41:54.644066 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.644031 2581 generic.go:358] "Generic (PLEG): container finished" podID="3c816bda-b0d3-483b-a123-96d13cf52d34" containerID="3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb" exitCode=0 Apr 20 14:41:54.644480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.644096 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" event={"ID":"3c816bda-b0d3-483b-a123-96d13cf52d34","Type":"ContainerDied","Data":"3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb"} Apr 20 14:41:54.644480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.644116 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" Apr 20 14:41:54.644480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.644130 2581 scope.go:117] "RemoveContainer" containerID="3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb" Apr 20 14:41:54.644480 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.644121 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79cb8b9576-tlf2w" event={"ID":"3c816bda-b0d3-483b-a123-96d13cf52d34","Type":"ContainerDied","Data":"1a2661eaf8a0ef664b5d97d9a6781ef657d7805c99837573f73a3c2a79be9445"} Apr 20 14:41:54.653576 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.653560 2581 scope.go:117] "RemoveContainer" containerID="3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb" Apr 20 14:41:54.653872 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:41:54.653844 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb\": container with ID starting with 3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb not found: ID does not exist" containerID="3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb" Apr 20 14:41:54.653930 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.653880 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb"} err="failed to get container status \"3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb\": rpc error: code = NotFound desc = could not find container \"3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb\": container with ID starting with 3ada04e5c2686bcf54b6a55565e8462f03a63a6afd68de0b367a6abdba16fceb not found: ID does not exist" Apr 20 14:41:54.660801 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.660779 2581 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-tlf2w"] Apr 20 14:41:54.666451 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:54.666427 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-tlf2w"] Apr 20 14:41:56.388633 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:41:56.388593 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c816bda-b0d3-483b-a123-96d13cf52d34" path="/var/lib/kubelet/pods/3c816bda-b0d3-483b-a123-96d13cf52d34/volumes" Apr 20 14:42:00.327293 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:42:00.327264 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:42:00.327931 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:42:00.327912 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:42:00.332974 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:42:00.332953 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:42:00.333829 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:42:00.333805 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:47:00.359653 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:47:00.359623 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:47:00.365265 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:47:00.365240 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:47:00.365456 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:47:00.365438 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:47:00.371229 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:47:00.371210 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:52:00.397239 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:00.397203 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:52:00.403134 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:00.403110 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:52:00.405180 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:00.405159 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log" Apr 20 14:52:00.410534 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:00.410516 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log" Apr 20 14:52:09.154113 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.154027 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"] Apr 20 14:52:09.154560 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.154349 2581 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" podUID="3250d467-6a26-4a86-bca3-2c39b4dd7245" containerName="manager" containerID="cri-o://3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add" gracePeriod=10 Apr 20 14:52:09.709019 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.708998 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" Apr 20 14:52:09.813661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.813587 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ts47\" (UniqueName: \"kubernetes.io/projected/3250d467-6a26-4a86-bca3-2c39b4dd7245-kube-api-access-5ts47\") pod \"3250d467-6a26-4a86-bca3-2c39b4dd7245\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " Apr 20 14:52:09.813661 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.813652 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3250d467-6a26-4a86-bca3-2c39b4dd7245-extensions-socket-volume\") pod \"3250d467-6a26-4a86-bca3-2c39b4dd7245\" (UID: \"3250d467-6a26-4a86-bca3-2c39b4dd7245\") " Apr 20 14:52:09.814063 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.814039 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3250d467-6a26-4a86-bca3-2c39b4dd7245-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3250d467-6a26-4a86-bca3-2c39b4dd7245" (UID: "3250d467-6a26-4a86-bca3-2c39b4dd7245"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:52:09.815837 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.815798 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3250d467-6a26-4a86-bca3-2c39b4dd7245-kube-api-access-5ts47" (OuterVolumeSpecName: "kube-api-access-5ts47") pod "3250d467-6a26-4a86-bca3-2c39b4dd7245" (UID: "3250d467-6a26-4a86-bca3-2c39b4dd7245"). InnerVolumeSpecName "kube-api-access-5ts47". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:52:09.915098 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.915074 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ts47\" (UniqueName: \"kubernetes.io/projected/3250d467-6a26-4a86-bca3-2c39b4dd7245-kube-api-access-5ts47\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:52:09.915098 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:09.915099 2581 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3250d467-6a26-4a86-bca3-2c39b4dd7245-extensions-socket-volume\") on node \"ip-10-0-139-136.ec2.internal\" DevicePath \"\"" Apr 20 14:52:10.089282 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.089213 2581 generic.go:358] "Generic (PLEG): container finished" podID="3250d467-6a26-4a86-bca3-2c39b4dd7245" containerID="3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add" exitCode=0 Apr 20 14:52:10.089417 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.089302 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"
Apr 20 14:52:10.089417 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.089307 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" event={"ID":"3250d467-6a26-4a86-bca3-2c39b4dd7245","Type":"ContainerDied","Data":"3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add"}
Apr 20 14:52:10.089417 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.089356 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks" event={"ID":"3250d467-6a26-4a86-bca3-2c39b4dd7245","Type":"ContainerDied","Data":"39f192415eb0096e71400aa8622cd1fd393e9eb19c8cab5c4ee0e3130494bc69"}
Apr 20 14:52:10.089417 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.089377 2581 scope.go:117] "RemoveContainer" containerID="3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add"
Apr 20 14:52:10.099746 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.099705 2581 scope.go:117] "RemoveContainer" containerID="3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add"
Apr 20 14:52:10.100012 ip-10-0-139-136 kubenswrapper[2581]: E0420 14:52:10.099996 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add\": container with ID starting with 3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add not found: ID does not exist" containerID="3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add"
Apr 20 14:52:10.100064 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.100020 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add"} err="failed to get container status \"3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add\": rpc error: code = NotFound desc = could not find container \"3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add\": container with ID starting with 3e1f6ea526b8978510f8625f4c59148377d803c5176dcbbd5229c1994ae38add not found: ID does not exist"
Apr 20 14:52:10.113041 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.113017 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"]
Apr 20 14:52:10.116381 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.116361 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-4hjks"]
Apr 20 14:52:10.387785 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:52:10.387703 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3250d467-6a26-4a86-bca3-2c39b4dd7245" path="/var/lib/kubelet/pods/3250d467-6a26-4a86-bca3-2c39b4dd7245/volumes"
Apr 20 14:57:00.430421 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:57:00.430314 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 14:57:00.437169 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:57:00.437148 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 14:57:00.444198 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:57:00.444176 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 14:57:00.449978 ip-10-0-139-136 kubenswrapper[2581]: I0420 14:57:00.449962 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 15:02:00.466811 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:00.466787 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 15:02:00.472186 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:00.472167 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 15:02:00.477517 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:00.477499 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 15:02:00.482940 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:00.482923 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 15:02:55.627636 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:55.627563 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-bs64c_b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41/manager/0.log"
Apr 20 15:02:55.739646 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:55.739619 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-79fdf5ff45-cw6jv_76083fc3-97c2-4b5a-be5c-a471f542b4f8/maas-api/0.log"
Apr 20 15:02:55.988086 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:55.988051 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-nkhrj_ba024e5e-b9fa-4ab8-bb53-1fe317a59889/manager/1.log"
Apr 20 15:02:56.223711 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:56.223680 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65c545df94-ssz28_b65f99ff-f35b-4eb0-bb2f-54ff0e147fab/manager/0.log"
Apr 20 15:02:56.443545 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:56.443463 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-5kr8w_f42738c5-6c35-4d09-aa1d-ae5ef4aae006/postgres/0.log"
Apr 20 15:02:57.238480 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.238456 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655_e6852ab4-9ca9-4ec0-8f18-753fa397a740/util/0.log"
Apr 20 15:02:57.245664 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.245635 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655_e6852ab4-9ca9-4ec0-8f18-753fa397a740/pull/0.log"
Apr 20 15:02:57.253624 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.253565 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655_e6852ab4-9ca9-4ec0-8f18-753fa397a740/extract/0.log"
Apr 20 15:02:57.380718 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.380696 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr_b7273032-549f-4b91-a2a8-308fe2e1e6ae/extract/0.log"
Apr 20 15:02:57.386387 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.386365 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr_b7273032-549f-4b91-a2a8-308fe2e1e6ae/util/0.log"
Apr 20 15:02:57.392406 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.392386 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr_b7273032-549f-4b91-a2a8-308fe2e1e6ae/pull/0.log"
Apr 20 15:02:57.501036 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.500972 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5_c5d3f867-bd46-458e-abe1-18fcecec62c0/extract/0.log"
Apr 20 15:02:57.506897 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.506883 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5_c5d3f867-bd46-458e-abe1-18fcecec62c0/util/0.log"
Apr 20 15:02:57.512781 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.512759 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5_c5d3f867-bd46-458e-abe1-18fcecec62c0/pull/0.log"
Apr 20 15:02:57.619820 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.619778 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn_f3816e90-952c-4608-882c-5ed7a62ee98e/util/0.log"
Apr 20 15:02:57.626348 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.626311 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn_f3816e90-952c-4608-882c-5ed7a62ee98e/pull/0.log"
Apr 20 15:02:57.632031 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:57.631996 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn_f3816e90-952c-4608-882c-5ed7a62ee98e/extract/0.log"
Apr 20 15:02:59.150611 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:59.150582 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-9zc9q_6b1bb7fc-bf8e-4cc9-b695-286dac851053/discovery/0.log"
Apr 20 15:02:59.387074 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:59.387050 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-87db58fcf-kth4n_cc9fc9e7-8226-482c-a827-42039d4cc2b3/kube-auth-proxy/0.log"
Apr 20 15:02:59.610877 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:59.610844 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7748fc9578-6ldxb_9e88f589-217b-4f8f-a0a4-7289dd42caff/router/0.log"
Apr 20 15:02:59.937549 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:59.937472 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks_2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3/storage-initializer/0.log"
Apr 20 15:02:59.944241 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:02:59.944217 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-vkmks_2b5b2e7d-fa3e-4efe-b8c2-90f0948dcef3/main/0.log"
Apr 20 15:03:00.057758 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.057689 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q_794d9478-89a5-4434-a337-2ac15854e724/storage-initializer/0.log"
Apr 20 15:03:00.064662 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.064642 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-s8f7q_794d9478-89a5-4434-a337-2ac15854e724/main/0.log"
Apr 20 15:03:00.170815 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.170791 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-n6swv_a0784e86-5872-40bd-80ec-27610ad81940/storage-initializer/0.log"
Apr 20 15:03:00.177804 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.177783 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-n6swv_a0784e86-5872-40bd-80ec-27610ad81940/main/0.log"
Apr 20 15:03:00.414976 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.414943 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c_a916b57d-5be1-4b40-a115-f81f6bc18d77/storage-initializer/0.log"
Apr 20 15:03:00.421620 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.421594 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-ldn6c_a916b57d-5be1-4b40-a115-f81f6bc18d77/main/0.log"
Apr 20 15:03:00.537129 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.537096 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-69bft_550bc51d-37ae-45c6-985b-ef22843e6040/storage-initializer/0.log"
Apr 20 15:03:00.544037 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:00.544021 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-69bft_550bc51d-37ae-45c6-985b-ef22843e6040/main/0.log"
Apr 20 15:03:07.833233 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:07.833203 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-4tqnb_a0fe6fd9-cb83-4fb9-94b4-1115d7285c71/global-pull-secret-syncer/0.log"
Apr 20 15:03:07.996829 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:07.996802 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fz7q2_6b54dd67-12ec-486a-ac4c-dcdccdd01de9/konnectivity-agent/0.log"
Apr 20 15:03:08.113658 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:08.113568 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-136.ec2.internal_57b16f66466a195429f73bc1a0dcec09/haproxy/0.log"
Apr 20 15:03:12.112079 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.112035 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655_e6852ab4-9ca9-4ec0-8f18-753fa397a740/extract/0.log"
Apr 20 15:03:12.143602 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.143577 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655_e6852ab4-9ca9-4ec0-8f18-753fa397a740/util/0.log"
Apr 20 15:03:12.169905 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.169879 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xb655_e6852ab4-9ca9-4ec0-8f18-753fa397a740/pull/0.log"
Apr 20 15:03:12.206386 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.206278 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr_b7273032-549f-4b91-a2a8-308fe2e1e6ae/extract/0.log"
Apr 20 15:03:12.230173 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.230138 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr_b7273032-549f-4b91-a2a8-308fe2e1e6ae/util/0.log"
Apr 20 15:03:12.253282 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.253264 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0nqjrr_b7273032-549f-4b91-a2a8-308fe2e1e6ae/pull/0.log"
Apr 20 15:03:12.295533 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.295477 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5_c5d3f867-bd46-458e-abe1-18fcecec62c0/extract/0.log"
Apr 20 15:03:12.317562 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.317541 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5_c5d3f867-bd46-458e-abe1-18fcecec62c0/util/0.log"
Apr 20 15:03:12.340879 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.340861 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73vpcg5_c5d3f867-bd46-458e-abe1-18fcecec62c0/pull/0.log"
Apr 20 15:03:12.373694 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.373678 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn_f3816e90-952c-4608-882c-5ed7a62ee98e/extract/0.log"
Apr 20 15:03:12.399878 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.399864 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn_f3816e90-952c-4608-882c-5ed7a62ee98e/util/0.log"
Apr 20 15:03:12.436245 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:12.436223 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1w4dmn_f3816e90-952c-4608-882c-5ed7a62ee98e/pull/0.log"
Apr 20 15:03:14.719897 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:14.719870 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4x946_9bd2c509-8ac1-41be-a6d6-6741cf3e3491/monitoring-plugin/0.log"
Apr 20 15:03:14.755098 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:14.755074 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l65ch_9682a026-b717-4f08-a6a3-d0eea65a227e/node-exporter/0.log"
Apr 20 15:03:14.780354 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:14.780332 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l65ch_9682a026-b717-4f08-a6a3-d0eea65a227e/kube-rbac-proxy/0.log"
Apr 20 15:03:14.811451 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:14.811434 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-l65ch_9682a026-b717-4f08-a6a3-d0eea65a227e/init-textfile/0.log"
Apr 20 15:03:16.779901 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.779864 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"]
Apr 20 15:03:16.780415 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.780395 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3250d467-6a26-4a86-bca3-2c39b4dd7245" containerName="manager"
Apr 20 15:03:16.780497 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.780418 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="3250d467-6a26-4a86-bca3-2c39b4dd7245" containerName="manager"
Apr 20 15:03:16.780497 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.780447 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c816bda-b0d3-483b-a123-96d13cf52d34" containerName="manager"
Apr 20 15:03:16.780497 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.780456 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c816bda-b0d3-483b-a123-96d13cf52d34" containerName="manager"
Apr 20 15:03:16.780648 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.780538 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="3250d467-6a26-4a86-bca3-2c39b4dd7245" containerName="manager"
Apr 20 15:03:16.780648 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.780553 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c816bda-b0d3-483b-a123-96d13cf52d34" containerName="manager"
Apr 20 15:03:16.783589 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.783569 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.786097 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.786075 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xvzk6\"/\"default-dockercfg-5f9zf\""
Apr 20 15:03:16.786199 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.786142 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xvzk6\"/\"openshift-service-ca.crt\""
Apr 20 15:03:16.786961 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.786941 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xvzk6\"/\"kube-root-ca.crt\""
Apr 20 15:03:16.793574 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.793552 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"]
Apr 20 15:03:16.849511 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.849484 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-proc\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.849616 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.849517 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-lib-modules\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.849616 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.849537 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzjnq\" (UniqueName: \"kubernetes.io/projected/05e1f532-ef7d-471d-a641-1ab9fb5ca647-kube-api-access-tzjnq\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.849616 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.849604 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-podres\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.849783 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.849645 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-sys\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.950741 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950694 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-podres\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.950848 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950756 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-sys\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.950848 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950816 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-proc\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.950848 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950827 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-sys\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.950848 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950836 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-lib-modules\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.951043 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950853 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzjnq\" (UniqueName: \"kubernetes.io/projected/05e1f532-ef7d-471d-a641-1ab9fb5ca647-kube-api-access-tzjnq\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.951043 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950878 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-podres\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.951043 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.950888 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-proc\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.951043 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.951015 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/05e1f532-ef7d-471d-a641-1ab9fb5ca647-lib-modules\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:16.960070 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:16.960048 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzjnq\" (UniqueName: \"kubernetes.io/projected/05e1f532-ef7d-471d-a641-1ab9fb5ca647-kube-api-access-tzjnq\") pod \"perf-node-gather-daemonset-mmmck\" (UID: \"05e1f532-ef7d-471d-a641-1ab9fb5ca647\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:17.094441 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.094376 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:17.218104 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.218078 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"]
Apr 20 15:03:17.219183 ip-10-0-139-136 kubenswrapper[2581]: W0420 15:03:17.219157 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod05e1f532_ef7d_471d_a641_1ab9fb5ca647.slice/crio-44171f91dddc47c08602d2cfabc191695e2d8744b7db43479a33db76cb88f6d1 WatchSource:0}: Error finding container 44171f91dddc47c08602d2cfabc191695e2d8744b7db43479a33db76cb88f6d1: Status 404 returned error can't find the container with id 44171f91dddc47c08602d2cfabc191695e2d8744b7db43479a33db76cb88f6d1
Apr 20 15:03:17.221023 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.221001 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:03:17.238828 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.238807 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/2.log"
Apr 20 15:03:17.244737 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.244702 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2hlrs_8651cc3c-2b74-4c3e-bd18-5c4883a6e3e5/console-operator/3.log"
Apr 20 15:03:17.705666 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.705620 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck" event={"ID":"05e1f532-ef7d-471d-a641-1ab9fb5ca647","Type":"ContainerStarted","Data":"b3a56f9d0e65bc2f68e6f7b8b0dc72b0f7ddb20dc44c121674e00a4339088bff"}
Apr 20 15:03:17.705666 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.705664 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck" event={"ID":"05e1f532-ef7d-471d-a641-1ab9fb5ca647","Type":"ContainerStarted","Data":"44171f91dddc47c08602d2cfabc191695e2d8744b7db43479a33db76cb88f6d1"}
Apr 20 15:03:17.705941 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.705683 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:17.716014 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.715996 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5ff44ddc76-xg9tb_6ab27e3b-7a3c-4245-a3f5-4de1786eafa2/console/0.log"
Apr 20 15:03:17.722061 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.722005 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck" podStartSLOduration=1.721989723 podStartE2EDuration="1.721989723s" podCreationTimestamp="2026-04-20 15:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:17.720294143 +0000 UTC m=+2177.899782611" watchObservedRunningTime="2026-04-20 15:03:17.721989723 +0000 UTC m=+2177.901478192"
Apr 20 15:03:17.758447 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:17.758429 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rqnwg_274e2639-2488-4c0c-a9f2-129da2913092/download-server/0.log"
Apr 20 15:03:18.990473 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:18.990449 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5qgnp_ce3a0704-031d-43e8-87ac-e53039d6f376/dns/0.log"
Apr 20 15:03:19.013520 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:19.013501 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5qgnp_ce3a0704-031d-43e8-87ac-e53039d6f376/kube-rbac-proxy/0.log"
Apr 20 15:03:19.153564 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:19.153535 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lmkks_4828e4f0-6105-42f8-8ec4-54d66f7d101d/dns-node-resolver/0.log"
Apr 20 15:03:19.741878 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:19.741850 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d997f6dc7-kx4pp_5a768c89-d5ac-4506-9349-6e193e93c0e4/registry/0.log"
Apr 20 15:03:19.815804 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:19.815778 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q5vqc_9d52b0cc-f4c2-4ce7-a3ab-7dce264fc27c/node-ca/0.log"
Apr 20 15:03:20.853799 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:20.853768 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-9zc9q_6b1bb7fc-bf8e-4cc9-b695-286dac851053/discovery/0.log"
Apr 20 15:03:20.902429 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:20.902405 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-87db58fcf-kth4n_cc9fc9e7-8226-482c-a827-42039d4cc2b3/kube-auth-proxy/0.log"
Apr 20 15:03:20.989204 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:20.989178 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7748fc9578-6ldxb_9e88f589-217b-4f8f-a0a4-7289dd42caff/router/0.log"
Apr 20 15:03:21.605343 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:21.605307 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-htgwf_e5f2a8a7-39d4-41d8-9ca2-1a049023a466/serve-healthcheck-canary/0.log"
Apr 20 15:03:22.030789 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:22.030762 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-6hcxh_a67564e3-e8db-4d6a-a8a4-591a0e2cf642/insights-operator/0.log"
Apr 20 15:03:22.031571 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:22.031557 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-6hcxh_a67564e3-e8db-4d6a-a8a4-591a0e2cf642/insights-operator/1.log"
Apr 20 15:03:22.343532 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:22.343466 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v88gx_c07adcb0-9da3-4948-b0cc-5a16508f6526/kube-rbac-proxy/0.log"
Apr 20 15:03:22.368621 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:22.368601 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v88gx_c07adcb0-9da3-4948-b0cc-5a16508f6526/exporter/0.log"
Apr 20 15:03:22.394555 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:22.394533 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-v88gx_c07adcb0-9da3-4948-b0cc-5a16508f6526/extractor/0.log"
Apr 20 15:03:23.721463 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:23.721433 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-mmmck"
Apr 20 15:03:24.402773 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:24.402739 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-bs64c_b9ad245f-c49d-4e9e-9e02-d7e5dc0e8c41/manager/0.log"
Apr 20 15:03:24.436961 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:24.436935 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-79fdf5ff45-cw6jv_76083fc3-97c2-4b5a-be5c-a471f542b4f8/maas-api/0.log"
Apr 20 15:03:24.512512 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:24.512485 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-nkhrj_ba024e5e-b9fa-4ab8-bb53-1fe317a59889/manager/0.log"
Apr 20 15:03:24.524000 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:24.523973 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-nkhrj_ba024e5e-b9fa-4ab8-bb53-1fe317a59889/manager/1.log"
Apr 20 15:03:24.624278 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:24.624253 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65c545df94-ssz28_b65f99ff-f35b-4eb0-bb2f-54ff0e147fab/manager/0.log"
Apr 20 15:03:24.715963 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:24.715946 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-5kr8w_f42738c5-6c35-4d09-aa1d-ae5ef4aae006/postgres/0.log"
Apr 20 15:03:30.517420 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:30.517379 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-58sfw_cca82e87-2fd5-4157-9e9f-c2ea4ea55866/migrator/0.log"
Apr 20 15:03:30.543859 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:30.543827 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-58sfw_cca82e87-2fd5-4157-9e9f-c2ea4ea55866/graceful-termination/0.log"
Apr 20 15:03:32.178529 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.178499 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:03:32.202078 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.202050 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/egress-router-binary-copy/0.log"
Apr 20 15:03:32.222530 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.222506 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/cni-plugins/0.log"
Apr 20 15:03:32.242129 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.242107 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/bond-cni-plugin/0.log"
Apr 20 15:03:32.263177 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.263156 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/routeoverride-cni/0.log"
Apr 20 15:03:32.286241 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.286221 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/whereabouts-cni-bincopy/0.log"
Apr 20 15:03:32.306958 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.306934 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ddt9k_330b3bed-9410-44fc-9f4d-401c49180ff9/whereabouts-cni/0.log"
Apr 20 15:03:32.531735 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.531695 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jhhn2_97b3ed92-b422-4a1d-bd36-49e50a7f088d/kube-multus/0.log"
Apr 20 15:03:32.653703 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.653675 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-t787k_5b3c9c26-01c0-40b2-ba38-e4b72ba81f66/network-metrics-daemon/0.log"
Apr 20 15:03:32.675875 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:32.675851 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-t787k_5b3c9c26-01c0-40b2-ba38-e4b72ba81f66/kube-rbac-proxy/0.log"
Apr 20 15:03:33.583472 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.583445 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-controller/0.log"
Apr 20 15:03:33.610588 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.610567 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/0.log"
Apr 20 15:03:33.619249 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.619225 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/ovn-acl-logging/1.log"
Apr 20 15:03:33.644319 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.644298 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/kube-rbac-proxy-node/0.log"
Apr 20 15:03:33.668933 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.668914 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 15:03:33.689436 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.689412 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/northd/0.log"
Apr 20 15:03:33.710119 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.710076 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/nbdb/0.log"
Apr 20 15:03:33.730907 ip-10-0-139-136 kubenswrapper[2581]: I0420 15:03:33.730886 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-r8xsm_a5549a94-73ae-4c4d-a853-281d46a86d49/sbdb/0.log"